How does using a CMS (e.g. WordPress/Drupal) affect backlinks and SEO?
-
So I need to build a website with over 100 pages in it. Elements of the design will probably be moved around and/or tested, so I need to use a CMS. It's pretty much a review site, so while the content will remain static, I'd like to employ A/B testing to improve conversion rates. WordPress even has a plugin for that.
So I'm just wondering: since CMS pages are pretty much created on the spot and not retrieved from a library, how does this affect backlinks and anchor text? How exactly does an external website point to yours if the URL is dynamically generated?
Or am I misunderstanding something? Please recommend any extra resources as well if you can.
-
Sorry, saw the follow-up, but I think the overall thread has you covered. The only real issue with CMS URLs is that you can sometimes have multiple versions pointing to the same page, and this creates duplicate content. There are plug-ins for WordPress that can help with that.
The only exception would be something like an AJAX-style URL, where the page content could change without the URL ever changing (Flash has the same issue, for example). You'll rarely see that in a standard CMS, though, and definitely not in WordPress.
-
Thank you very much CMC-SD, Jared and goodlegaladvice for all your help.
@CMC-SD: As promised, I stole your analogy (Now I realize it was an analogy and not a metaphor, I think) and I tried to explain CMS to my girlfriend who knows nothing about computers. Unfortunately it did not come out as elegantly as you put it and we ended up eating bison burgers instead.
-
Ditto to that Jared. Great explanation. And now I'm hungry.
-
Oh, okay, I definitely misunderstood. You're asking about the back-end rewriting process that makes a pretty URL point to the corresponding ugly URL which in turn points to the page. That's way back-end. Unlike a 301 redirect, it's invisible to the spider. The spider need never know that a URL like http://www.domain.com/?p=123 even exists. While it's crawling, it sees a link to http://www.domain.com/page1.html, follows the link, and sees the HTML for that page. That's all.
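If it helps to see it spelled out, here's a toy sketch in Python of the difference between an internal rewrite and a 301. This is not real server code, and the `/?p=123` mapping and URLs are made-up WordPress-style examples:

```python
# Toy sketch of internal rewrite vs. 301 redirect -- not real server code.
# The URLs and the ?p=123 mapping are hypothetical WordPress-style examples.

REWRITE_MAP = {"/page1.html": "/?p=123"}   # pretty URL -> internal "ugly" URL
CONTENT = {"/?p=123": "<html>Page 1's finished HTML</html>"}

def serve_with_rewrite(path):
    # Internal rewrite: the server quietly translates the pretty URL and
    # answers 200 OK with the finished page. The client -- browser or
    # spider -- never sees /?p=123.
    internal = REWRITE_MAP.get(path, path)
    return 200, CONTENT[internal]

def serve_with_301(path):
    # A 301, by contrast, hands a different URL back to the client and makes
    # it issue a second request, so the spider learns both URLs exist.
    return 301, REWRITE_MAP[path]

status, body = serve_with_rewrite("/page1.html")  # 200 plus HTML; no ugly URL in sight
```

The point is that the rewrite happens entirely inside the server, so from the spider's perspective the pretty URL is simply where the page lives.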
-
@CMC-SD: Great metaphor! I'm going to steal it. But I already knew that about CMSs xD. In fact my confusion was about what follows from that... If the pages are created dynamically and not retrieved from the webserver itself, how would a backlink even REFER to one??
I actually found this SEO blog touching on the subject matter: http://www.seomoz.org/blog/url-rewrites-and-301-redirects-how-does-it-all-work
So, pretty much this is how it works: a page is linked through the URL that is dynamically generated by a CMS, but the webserver rewrites that URL so it points to the original URL. Pretty much the same thing. And Google indexes that URL plus the HTML on the page. Is that about right? That is why I should not worry at all.
-
Thanks! That's what happens when a creative writing major learns php.
-
This is probably the most well-constructed and humorous explanation of this that I have ever read. Bravo.
-
No. What "indexing" means is creating a database of URLs and the HTML that those URLs point to. If your site has been "indexed," it means Google has discovered your URLs and taken note of the HTML that can be found at those URLs.
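If it helps, here's a rough sketch of what that database amounts to. Real search indexes are vastly more elaborate, and the URLs, HTML, and function names here are invented purely for illustration:

```python
# Rough sketch: an "index" is, at its core, a mapping from discovered URLs
# to the HTML found at each one. All data here is made up.

def build_index(fetch, seed_urls):
    """fetch(url) -> HTML string. Returns a {url: html} 'index'."""
    index = {}
    for url in seed_urls:
        # The spider neither knows nor cares whether this HTML came from a
        # static file or was generated on the fly by a CMS.
        index[url] = fetch(url)
    return index

site = {
    "http://www.domain.com/page1.html": "<html>page one</html>",
    "http://www.domain.com/page2.html": "<html>page two</html>",
}
index = build_index(site.get, site.keys())
```

Being "indexed" just means your URLs are keys in a table like that, with the HTML Google found recorded alongside them.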
-
I think you are misunderstanding something, yes.
On a website with a CMS, the URL is not "dynamically generated." The page is dynamically generated. Here's what that means. Whenever you type http://www.domain.com/page1.html into your browser, you are telling your browser to go to that website and pull up the HTML that corresponds to that URL. URL stands for "uniform resource locator," meaning directions to the location of a resource. If you have an old-fashioned website, the URL points to an HTML file that you created, either by typing everything yourself or by using a WYSIWYG editor. If you have a CMS, the URL essentially instructs your website to build the corresponding HTML page on the fly.
It's like ... okay, imagine that you walk into a bakery and ask for a chocolate chip cookie. They could either pull a pre-baked chocolate chip cookie off the shelf and hand it to you, or walk in the back and bake you one cookie from the ingredients in the kitchen. When we're talking about baked goods, option 1 is almost always better than option 2 because it's orders of magnitude faster and more efficient. The benefits that option 2 offers aren't worth the extra time and lost efficiency. But when we're talking about websites, that's no longer the case. The server can construct an HTML document almost instantaneously. Your browser gets the HTML just as fast as it would if it asked for a static HTML page.
In fact, your browser really has no idea that this is all happening. Here's another food metaphor. You walk into a fast food joint and order a hamburger. The cashier walks into the kitchen, and a minute later, walks out with your hamburger. Did the cashier pull the hamburger off a shelf of hamburgers that have been sitting under a hotlight for hours? Or did the cashier ask the cook to prepare a fresh hamburger just for you? Assuming the hamburger tastes great either way, you have no way of knowing. In this metaphor, the customer is the surfer, the cashier is the browser, and the kitchen is the server your website is hosted on. Either your server has a bunch of pre-made pages sitting around waiting for someone to "order" them, or your server has a clever program that makes the pages only when they're needed. That clever program, the CMS, is like the short-order cook.
The thing to remember is, the search engine spiders are customers, just like the surfer. They don't know what's going on in the kitchen. They don't care. They "typed in" a URL and got some HTML back. They now know that that URL produces that HTML. They remember that. When they see a link to that URL, they know it's pointing to that HTML.
Clear as mud?
-
Ahhh, so Google indexes URLs and not the pages themselves? D'oh.
-
"So I'm just wondering: since CMS pages are pretty much created on the spot and not retrieved from a library, how does this affect backlinks and anchor text? How exactly does an external website point to yours if the URL is dynamically generated?"
Firstly, different CMSs create pages differently. CMS just means content management system, which means the platform simply provides a GUI for you to add content or make changes. If you are using WP and creating pages, then these pages will be indexed like any other page, and links pointing to one would simply target that page's URL.
WordPress uses permalinks and Drupal uses Pathauto to rewrite platform-generated links into SEO-friendly ones. They use an internal rewrite, and the resulting URL is what gets indexed in Google. Therefore, you simply treat the resulting URL as the "real" URL, and external links to it work fine.
-
Right, no difference. I took a whole site of static pages and changed it over to a CMS with all the rewrites, and everything works great. I kept the URLs the same, though, including the .htm extension.
-
I was under the impression that URL rewrites just change the way the URL is displayed in the browser but not the URL itself. I really need to learn more about the backend stuff.
So it would make no difference if the backlink contained an absolute path?
-
It works the same as a static page, except it's easy to manage your content....
You usually also use a URL rewrite engine that can make your URLs say whatever you want. In fact, most of the web is now on a CMS.
Backlinks and anchor text all work the same....