Does Google crawl and index dynamic pages?
-
I've linked a category page (static) to my homepage and a product page (dynamic) to that category page. I crawled my website starting from the homepage URL with Screaming Frog, using Googlebot 2.1 as the user agent. Based on the results, it can crawl the product page, which is dynamic.
Here's a sample product page, which is a dynamic page (we're using product IDs instead of keyword-rich URLs for consistency): http://domain.com/AB1234567
Here's a sample category page:
Here's my full question: does the spider result (from Screaming Frog) mean Google will properly crawl and index the product pages even though they are dynamic?
-
Andy's correct. The answer is most likely yes: if Screaming Frog can find these pages, Googlebot most likely will as well. As long as the spider can find a page, it doesn't matter whether it's static or dynamic. However, if a URL has multiple parameters, Google may decide it's not a useful page and decline to index it. Make sure to include a sitemap with all of these URLs; it may help get them indexed if your site architecture is poor.
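To illustrate the sitemap suggestion above, here's a minimal Python sketch that builds a sitemap.org-format XML file from a list of product URLs. The `build_sitemap` helper and the example URLs are hypothetical, following the ID-based URL pattern from the question; this isn't a Moz or Google tool, just the standard library.

```python
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    """Build a minimal sitemap.org-format XML document from a list of URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        loc = ET.SubElement(entry, "loc")
        loc.text = url
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical product URLs following the ID pattern from the question.
product_urls = ["http://domain.com/AB1234567", "http://domain.com/AB1234568"]
print(build_sitemap(product_urls))
```

Save the output as sitemap.xml at the site root and submit it in Search Console so Google has a direct list of the dynamic URLs, independent of the internal linking.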
-
For the most part, Google can find its way around dynamic pages with few problems. There are always going to be exceptions, but these tend to occur when there are lots of components in the URL structure.
As long as the URL is pretty simple, you shouldn't have any problems.
-Andy
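To make the "lots of components in the URL structure" caveat concrete, here's a rough sketch of a heuristic check using Python's standard library. The two-parameter threshold is an arbitrary assumption for illustration, not a rule Google publishes: the point is just that a bare ID URL like the question's example has zero parameters, while heavily parameterized URLs are the ones more likely to be skipped.

```python
from urllib.parse import parse_qs, urlparse

def parameter_count(url):
    """Count the distinct query parameters in a URL."""
    return len(parse_qs(urlparse(url).query))

def looks_crawl_friendly(url, max_params=2):
    """Hypothetical heuristic: flag URLs whose parameter count is high
    enough that a crawler might judge them low-value. The threshold is
    an assumption, not a documented Googlebot behavior."""
    return parameter_count(url) <= max_params

# The question's product URL has no parameters at all, so it passes easily.
print(looks_crawl_friendly("http://domain.com/AB1234567"))
```

Running something like this over a crawl export (e.g. the URL list from Screaming Frog) is one quick way to spot the parameter-heavy outliers worth reviewing.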