How do we ensure our new dynamic site gets indexed?
-
Just wondering if you can point me in the right direction. We're building a 'dynamically generated' website; basically, pages don't technically exist until the visitor types in the URL (or clicks an on-page link), at which point they are created on the fly for the visitor.
The major concern I’ve got is that Google won’t be able to index the site, as the pages don't exist until they're 'visited'. To top it off, they're rendered in JSPX, which makes it tricky to ensure the bots can view the content.
We’re going to build/submit a sitemap.xml to signpost the site for Googlebot but are there any other options/resources/best practices Mozzers could recommend for ensuring our new dynamic website gets indexed?
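For context, since the pages are generated on the fly, the plan is to generate the sitemap from the same data source that drives the pages. A rough sketch of what that might look like in Python (the domain and paths are placeholders, not the real site):

```python
# Sketch: generate sitemap.xml entries from the data that drives the
# dynamic pages. The base URL and paths below are placeholders.
from xml.sax.saxutils import escape

def build_sitemap(base_url, paths):
    entries = "\n".join(
        "  <url><loc>{}</loc></url>".format(escape(base_url + path))
        for path in paths
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + entries +
        "\n</urlset>"
    )

sitemap = build_sitemap("http://www.example.com", ["/products/1", "/products/2"])
print(sitemap)
```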
-
Hi Ryan,
Mirroring what Alan said, if the links are HTML text links (and they should be), then you will reduce your crawling problem with Google.
If you must use JavaScript links, make sure to duplicate them inside <noscript> tags so that Google will follow them:

http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66355

But be careful: Google doesn't treat <noscript> links like regular HTML links. At best, it's a poor alternative.

Google derives so many signals from HTML links (anchor text, PageRank, context, etc.) that they're almost essential for a search-engine-friendly site.

The Beginner's Guide to SEO has a relevant chapter on the basics of search engine friendly design and development:

http://www.seomoz.org/beginners-guide-to-seo/basics-of-search-engine-friendly-design-and-development

Best of luck!
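To illustrate, a JavaScript link with a <noscript> fallback might look something like this (the URL, function name, and anchor text are invented for the example):

```html
<!-- JavaScript-driven link with a plain-HTML fallback.
     The URL, handler, and anchor text are placeholders. -->
<a href="#" onclick="loadPage('/products/42'); return false;">Blue Widget</a>
<noscript>
  <a href="/products/42">Blue Widget</a>
</noscript>
```

Again, a plain HTML link is far preferable to relying on the fallback.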
-
Definitely want to get it right before launch. It's not going anywhere until it is absolutely ready!
-
The project this reminds me of took six months to complete, and the 301s alone were a full-time job.
Get it right the first time... you do not want to restructure like this on a large dynamic site.
I must say the project worked out, but I got all my grey hair the day we threw the switch...
-
When I say it's costly to rewrite 200,000+ URLs, I mean it. Correcting mistakes here can cost big dollars.
In this case it was costly to the tune of $60,000+ in costs and losses; however, the bottle of bubbly at the end of the six-month project was tasty.
The point being: do it right the first time.
As I said before, your best bet is documentation. Large dynamic sites generate large dynamic problems very quickly if not watched closely.
-
Thank you Khem, very helpful replies.
-
One more thing I missed: internal linking. Make sure each page is linked to with a text link, but avoid over-linking; don't try to link to all the pages from the home page. Generally we link to all the categories and pages from the footer or site-wide links.
-
Okay, let's do it step by step.
First, if it's a product website, create a separate feed for products and submit the sitemap to Google.
If it's not, you may well have separate news/articles/videos sections; create a separate XML sitemap for each section and submit it to Google.
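Those per-section sitemaps are then tied together with a single sitemap index file, which is what you actually submit; a minimal sketch with placeholder names:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>http://www.example.com/sitemap-products.xml</loc></sitemap>
  <sitemap><loc>http://www.example.com/sitemap-news.xml</loc></sitemap>
  <sitemap><loc>http://www.example.com/sitemap-videos.xml</loc></sitemap>
</sitemapindex>
```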
Either way, make sure you have only search-engine-friendly URLs. Whoever says rewriting 200,000+ pages is costly should compare that cost with the business you'll lose when none of your products are listed in Google. So make sure to rewrite all the dynamic URLs if you suspect Google might have problems crawling your website's URLs.
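On the search-engine-friendly URLs point, the usual approach is a rewrite rule that maps a clean path onto the dynamic one; a rough Apache mod_rewrite sketch (the URL pattern and the .jspx script name are invented for illustration):

```apache
# Serve a clean URL like /products/blue-widget-42 from the dynamic page.
# The pattern and target script are placeholders.
RewriteEngine On
RewriteRule ^products/([a-z0-9-]+)-([0-9]+)$ /product.jspx?id=$2 [L,QSA]
```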
Second, study the Webmaster Tools data very carefully for warnings and errors, so that you can figure out the issues Google might be facing while visiting your website.
Avoid duplicate entries of products. Generally we don't pay attention to these things and show the same products on different pages in different categories. Google will filter out all those duplicate pages, and can even penalize your website over the duplicate content issue.
Third, keep promoting, but avoid grey/black-hat techniques; there is no shortcut to success. You'll have to spend time and money.
-
It's definitely something we're taking a very close look at. Another thing not mentioned is the use of canonical tags to head off duplicate content issues, which I'll be ensuring is implemented.
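For anyone following along, the canonical tag is a single line in the <head> of each duplicate variant, pointing at the preferred URL (the URL below is just an example):

```html
<link rel="canonical" href="http://www.example.com/products/blue-widget" />
```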
My next mugshot might have significantly grayer hair after this is all done...
-
Thanks very much for the replies.
I'll ensure proper cross-linking from the navigation and on the pages themselves, and submit a full XML sitemap, along with the social media options suggested. My other concern is that the content itself won't be visible to Googlebot because the site is largely JavaScript-driven, but that's something I'm working with the developers to resolve.
-
As you can tell from the responses above, indexation is not what you should be worried about.
Dynamic content is not foolproof. The mistakes are costly, and you never want to be involved in rewriting a 200,000+ page dynamic rat's nest.
Sorting features can create dynamic URLs and duplicate content.
Structure changes or practice changes can cause crawl errors. Earlier today I looked at a client report with 3,000+ errors, compared to 20 last week; this was all due to a request the owner made to the developer.
When enough attention isn't paid to this, it causes real issues.
The best advice I can offer is to maintain a best-practices document that all developers must follow.
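To make the sorting point concrete: the same listing reachable under different sort parameters looks like many distinct URLs to a crawler. One way to reason about it is to normalize away presentation-only parameters; a Python sketch (the parameter names are invented, not from any real site):

```python
# Strip presentation-only query parameters (sort, order, page size)
# so duplicate listing URLs collapse to one canonical URL.
# The parameter names here are placeholders for illustration.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

IGNORED_PARAMS = {"sort", "order", "per_page"}

def canonicalize(url):
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORED_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(sorted(kept)), ""))

a = canonicalize("http://example.com/widgets?sort=price&category=blue")
b = canonicalize("http://example.com/widgets?category=blue&order=asc")
```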
-
Make sure every page you would like crawled is linked to in some manner. You can create natural links to them, e.g. from your navigation or in text links, or you can put them in a sitemap.
You can also link to these pages from sites like Facebook and Twitter for faster crawling.
Tell Google in your robots.txt that it can access your website, and make sure none of the pages you would like indexed carry the noindex value in the robots meta tag.
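For reference, a robots.txt that allows crawlers everywhere is as simple as this (a generic example, not specific to any site):

```text
User-agent: *
Disallow:
```

And the tag to check for on pages you want indexed is the robots meta tag; make sure none of them carry <meta name="robots" content="noindex">.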
Good luck!
-
Any link. But I should correct what I said: they will be crawled, not necessarily indexed.
-
Thanks for the reply, Alan. Do you mean links from the sitemap?
-
If you have links to the pages, they will be indexed; dynamic or static, it does not matter.