Dynamic pages and code within content
-
Hi all,
I'm considering creating a dynamic table on my site that highlights rows, columns, and cells depending on which buttons users click. Each cell in the table links to a separate page that is created dynamically, pulling information from a database.
Now I'm aware of the google guidelines: "If you decide to use dynamic pages (i.e., the URL contains a "?" character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few."
So we wondered whether we could put the dynamic pages in our sitemap so that Google could index them. The pages can still be seen with JavaScript off; JavaScript is only used to manipulate the table and make it dynamic.
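For what it's worth, listing a dynamic URL in a sitemap is nothing special: the entry is just the full URL, query string and all. A minimal sketch, assuming a URL pattern like the developer's `Index.asp?Page=ExamplePage` example and a placeholder domain:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Query-string URLs are valid sitemap entries; if a URL contains
       several parameters, the "&" must be escaped as "&amp;" -->
  <url>
    <loc>http://www.example.com/Index.asp?Page=ExamplePage</loc>
  </url>
</urlset>
```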
Could anyone give us an overview of the dangers here?
I also wondered: do you still need to separate content from code on a page? My developer still seems very keen on inline CSS and JavaScript!
Thanks a bundle.
-
That's it! Thanks Baptiste - the anchor text is going to be an image and an unoptimised image at that... I knew there was a reason that my brain was kicking up a fuss with this code. Brilliant, thanks.
-
If you have pages on your site without any internal links to them (I mean links Googlebot can follow when it visits your site), they won't rank well, and your site won't rank well either.
That's because those pages will only be visible to Google through the sitemap. Googlebot will think: "These pages aren't accessible to users through the site's links? Then I shouldn't rank them."
You will penalize your whole domain too, because you will be losing the value of those extra pages on the site, especially if they have good content (I assume that's the case anyway).
BUT, given your example, Googlebot should be able to access them with your strange link. The question is: what is the anchor text? Is it a generic label, or a good keyword for each page? If the anchor text is bad, I would build a dedicated section of the site that lets both users and Googlebot reach these pages with good anchor text.
Maybe an HTML sitemap, like the one you can see on Rotten Tomatoes.
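Something like this bare-bones sketch: one plain, crawlable link per dynamic page, each with descriptive anchor text (the page names here are made up):

```html
<!-- Hypothetical HTML sitemap section: every dynamic page gets a
     normal <a href> link with keyword-rich anchor text, so users and
     Googlebot can reach it without clicking the table buttons -->
<ul>
  <li><a href="Index.asp?Page=ExamplePage">Descriptive keyword anchor</a></li>
  <li><a href="Index.asp?Page=AnotherPage">Another descriptive anchor</a></li>
</ul>
```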
-
Ok, thanks so much for the advice. Just to clarify: there is a link to the dynamic page, and its address is permanent, so it can be accessed from the browser address bar. Here is the example link the developer gave:
Index.asp?Page=ExamplePage
If I understand correctly, you are saying that this dynamic page can't rank? Even if it is permanently there? Is that right?
-
Agreed, rel=canonical.
-
I would rel=canonical all versions of the page to point to one without a query string. You only want one page to be indexed.
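Concretely, each query-string variant would carry a tag like this in its `<head>` (a sketch; the domain and clean URL are placeholders, not the poster's real site):

```html
<!-- Placed in the <head> of Index.asp?Page=ExamplePage and any other
     parameterised variants, so only the clean URL gets indexed -->
<link rel="canonical" href="http://www.example.com/example-page/" />
```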
I would also (sorry) reconsider your ties with a developer who wants to use inline CSS. That is just dumb...
-
I would not recommend doing so. Google will index your pages, but nobody will be able to find them anywhere on the internet, even on your own website. They won't rank.
You should at least give users and Google a way to access these pages without the buttons.
As for the code / content / style issue, you should really stop mixing it all together - it's been 10 years since people started separating content from style.
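The fix is cheap. A before/after sketch of what separating them looks like (the class name and styles are made up for illustration):

```html
<!-- Before: presentation mixed into the content -->
<p style="color: #c00; font-weight: bold;">Sale ends Friday</p>

<!-- After: content stays in the HTML... -->
<p class="promo">Sale ends Friday</p>

<!-- ...and presentation lives in an external stylesheet (styles.css),
     where one rule restyles every promo on the site at once:
     .promo { color: #c00; font-weight: bold; } -->
```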