Dynamic page
-
I have a few pages on my site of this nature:
/locator/find?radius=60&zip=&state=FL
I read on Google's Webmaster Central blog that they suggest not rewriting URLs like this:
According to Google's blog (link below), they are able to crawl the simplified dynamic URL just fine, and it is even encouraged to keep a simple dynamic URL: "It's much safer to serve us the original dynamic URL and let us handle the problem of detecting and avoiding problematic parameters."
http://googlewebmastercentral.blogspot.com/2008/09/dynamic-urls-vs-static-urls.html
Rewriting can even lead to a decrease, per this line: "We might have problems crawling and ranking your dynamic URLs if you try to make your urls look static and in the process hide parameters which offer the Googlebot valuable information."
My URLs are already simplified without any extra parameters, which is the structure Google recommends:
"Does that mean I should avoid rewriting dynamic URLs at all? That's our recommendation, unless your rewrites are limited to removing unnecessary parameters, or you are very diligent in removing all parameters that could cause problems."
I would love to get some opinions on this. Please also consider that those pages are not cached by Google for some reason.
-
I think this is an answer that goes beyond Google. We use rewrites extensively and do not have any problems. There are some caveats:
- Regarding GoogleBot missing information, you just need to make sure that the new URL carries all the same information.
Let's say you run a plumbing portal and use
/locator/find?radius=60&zip=&state=FL
rewrite to
/plumbers/florida-fl/miami/33110/
Your search radius can be a default value instead of a parameter you have to pass in.
It helps with site structure to think of things as they would be laid out in a static directory. In this case, you are actually giving GoogleBot more information with the rewritten URL than with the old one: you have included what is being searched for (a plumber), the city (Miami), the state (FL), and the ZIP code (33110). The previous URL only indicated the state. If you don't like using all the folders, you can simply use a longer file name with dashes between the words.
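A rewrite like this is typically done in the web server config. Here is a minimal sketch in Apache mod_rewrite, using the example URLs above; the exact slug pattern and default radius are assumptions for illustration, not a known setup:

```apache
# Hypothetical rule: map /plumbers/{state-name}-{st}/{city}/{zip}/
# to the existing locator script, with a default 60-mile radius.
RewriteEngine On
RewriteRule ^plumbers/[a-z-]+-([a-z]{2})/[a-z-]+/([0-9]{5})/?$ /locator/find?radius=60&state=$1&zip=$2 [L,QSA]
```

The [L] flag stops further rule processing, and [QSA] preserves any extra query-string parameters a visitor appends, so nothing is silently dropped.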
-
If you use rewrites, make sure Google is not spidering the original URLs; otherwise you will get penalized for duplicate content. Monitoring Webmaster Tools or using spider software will help you find the holes. You can then use canonical links and noindex tags to get the old URLs out of the index and make sure Google has the correct pages. This all depends on how you implement your rewrites.
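For example, the old dynamic page can declare the rewritten URL as canonical, or be kept out of the index entirely with a robots meta tag. These are illustrative snippets only; the domain is a placeholder:

```html
<!-- On the old dynamic page: tell Google the rewritten URL is the canonical one -->
<link rel="canonical" href="http://www.example.com/plumbers/florida-fl/miami/33110/" />

<!-- Or: keep the old URL out of the index while still letting Google follow its links -->
<meta name="robots" content="noindex, follow" />
```

Use one approach or the other on a given page; a canonical consolidates signals to the new URL, while noindex simply removes the old one from results.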
-
If you take some time up front to decide how you want to organize your site, the first two items will usually take care of themselves. A good exercise is to write down how all of this would work within a breadcrumb navigation. This forces you to get organized and also helps you plan how all your pages will be presented to Google. If you do start to add parameters on top of this basic structure, like pagination or other sort options, you need to think about how you would noindex, follow those pages so that your main page ranks for a given key phrase instead of competing with all the sorted versions of the same page.
-
One thing that is overlooked in setting up this kind of structure is that you can use it to your advantage in your analytics tools to look at global trends on your site. This applies to any site. Using the example above, all US states are at the 2nd directory level, cities at the 3rd, and ZIP codes at the 4th. That makes it really easy to use a regexp on URLs to group them. For example, you could set up a filter in your analytics to combine all sessions that looked at pages in Florida and see what the next action was.
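As a rough sketch of that idea (in Python, with made-up page paths following the folder structure above), one regex at a fixed path depth is enough to group traffic by state:

```python
import re
from collections import Counter

# Hypothetical page paths following the /plumbers/{state}/{city}/{zip}/ structure
paths = [
    "/plumbers/florida-fl/miami/33110/",
    "/plumbers/florida-fl/orlando/32801/",
    "/plumbers/texas-tx/austin/78701/",
]

# The state always sits in the 2nd path segment, so a single pattern covers every page
state_re = re.compile(r"^/plumbers/([a-z-]+-[a-z]{2})/")

by_state = Counter(state_re.match(p).group(1) for p in paths)
print(by_state)  # florida-fl counted twice, texas-tx once
```

The same pattern works as a filter or content-grouping rule in most analytics tools, since they accept the same flavor of regular expression.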
Cheers!
-
Hi,
I don't think this means that you have to avoid rewriting dynamic URLs at all, but you do have to take care that your information stays accessible.
For your URL, it could be interesting to structure it like:
/locator/florida/find?radius=60
/locator/24786/find?radius=60
or even better:
/stores-near-florida/find?range=60
/stores-near-24786/find?range=60
Google's suggestion just says that you have to avoid losing information when mapping your dynamic URL to a static one. You should leave the radius parameter in the URL because Google may vary this parameter.
-
Correction: the pages are found by Google.