What's the best blogging platform?
-
A year ago an SEO specialist evaluated my WordPress site and said she had seen lower rankings for WordPress sites in general. We moved our site off any CMS and designed it in HTML5. Our blog, however, is still on WordPress. I'm thinking about moving to the Ghost platform because I only need a blog. The drawbacks are one author, no recent-posts list, and no meta tags. Is it worth it to move the site off WordPress? Will it affect my rankings much if I have great content? Does anyone have experience with or opinions on Ghost?
-
WordPress is absolutely one of the best blogging CMS solutions today. We run a couple of websites on WordPress too, and everybody can maintain and update them.
-
I agree. I think WordPress is the easiest and the most user-friendly. Plus, you can change themes or customize your site very easily as well.
-
I'm a strong fan of WordPress. Install the Yoast plugin and make sure you put internal links in your blog posts. Even a related-posts plugin like YARPP will help by adding more internal links to other posts. WordPress usually needs a few SEO tweaks, but it is a great CMS for being SEO-friendly.
-
I have a lot of local website clients. I use the WordPress platform exclusively for these websites and have had phenomenal success with rankings for these sites. One particular client's site was nowhere to be found on Google two years ago. They now dominate the top three positions on Google's first page for over 100 relevant keywords. (These are strong head terms, not just long-tail terms.)
So I'm a strong supporter of WordPress. Of course you have to work on optimizing your pages, but WordPress makes it so easy and quick to make changes.
Related Questions
-
New website's ranking dropped
Hi, I'm working on a brand new website. I haven't even started my link building yet, just added it to local directories. I slowly started getting rankings on the 3rd page of Google, then a few weeks ago my rankings fell for all the keywords, so now the website doesn't even rank on the 10th page. It's been like this for a few weeks now. Here's a screenshot of the website: http://screencast.com/t/wDWk8sxLw Thanks for your help!
Technical SEO | mezozcorp0
-
Creating in-text links with 'target=_blank' - helping/hurting SEO!?!
Good Morning Mozzers, I have a question regarding a new linking strategy I'm trying to implement at my organization. We publish 'digital news magazines' that oftentimes have in-text links pointing to external sites. More recently, the editorial department and I (SEO) conferred on some ways to reduce our bounce rate and increase time on page. One of the suggestions I offered was to add the target="_blank" attribute to all the links so that site visitors don't necessarily have to leave the site in order to view the link. It has, however, come to my attention that this can have some very negative effects on my SEO program, most notably fake or inaccurate time-on-page figures. Is this an advisable way to create in-text links? Are there any other negative effects that I can expect from implementing such a strategy?
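For reference, the markup in question would look roughly like this (example.com is a placeholder, and the rel attribute is the usual recommendation for links that open a new tab rather than something already in place):

    <!-- External in-text link that opens in a new tab. rel="noopener noreferrer"
         keeps the new tab from accessing window.opener and suppresses the referrer;
         add rel="nofollow" only if you don't want to pass link equity to the target. -->
    <a href="https://example.com/source-article" target="_blank" rel="noopener noreferrer">read the full report</a>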
Technical SEO | NiallSmith0
-
Moz Reporting Incorrect 404's
Hi guys, SEOMoz is telling me that we have 191 404 errors. I have checked this with several other crawlers and this is not the case. For example, it reports http://www.opticalexpress.co.uk/eyecare/corporate-savings.html%0D%0A2027 but the correct link is http://www.opticalexpress.co.uk/eyecare/corporate-savings.html which is fine... We have no record of these links, so why is it appending these characters to the end of the URL and causing the 404s?
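For reference, %0D%0A is just a URL-encoded carriage return and line feed, which suggests the crawler is finding a link whose href contains a literal line break followed by stray text. A quick decode of the reported URL (a sketch using Python's standard library):

    from urllib.parse import unquote

    crawled = "http://www.opticalexpress.co.uk/eyecare/corporate-savings.html%0D%0A2027"
    print(repr(unquote(crawled)))
    # -> '...corporate-savings.html\r\n2027'
    # %0D%0A decodes to \r\n, so the link the crawler picked up has a line
    # break (and the stray "2027") baked into it, which points at whatever
    # source emits that malformed href rather than the target page.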
Technical SEO | EwanFisher0
-
Client's site dropped completely for all keywords, but not brand name - not manual penalty... help!
We just picked up a new search client a few weeks ago. They've been a customer (we're an automotive dealer website provider) since October of 2011. Their content was very generic (it came from the previous provider), so we did a quick once-over as soon as he signed up. Beefed up his page content, made it more unique and relevant... tweaked title tags... wrote meta descriptions (he had none). In just over a week, he went from ranking on page 4 or 5 for his terms to ranking on page 2 or 3. My team was working on getting his social media set up, setting up his blog, and starting competitor research... And then this last weekend, something happened and he dropped completely from the rankings...
He still shows up if you do a site: search or if you search his exact business name, but for everything else he's nowhere to be found. His URL is www.ohioautowarehouse.com, business name is "Ohio Auto Warehouse". We filed a reconsideration request on Monday, and just got a reply today that there was no manual penalty. They suggested we check our content, but we know we didn't do anything spammy or black-hat. We hadn't even fully optimized his site yet - we were just finishing up his competitor research and were planning a full site optimization next week... so we're at a complete loss as to what happened.
Also, he's not ranking for any of the vehicles in his inventory. Our vehicle pages always rank on page 1 or 2, depending on how big the city is... you can always search "year make model city" and see our customers' sites (whether they're doing SEO or not). This guy's cars aren't showing up... so we know something is going on... Any help would be a lifesaver. We've been doing this for quite some time now, and we've never had a site get penalized. Since the reconsideration request didn't help, we're not sure what to do...
Technical SEO | Greg_Gifford0
-
Sitemap coming up in Google's index?
I apologize if this question's answer is glaringly obvious, but I was using Google to view all the pages it has indexed for our site, by searching for our company and then clicking the link that says to display more results for the site. On page three, it has the sitemap indexed as if it were just another page of our site: www.stadriemblems.com/sitemap.xml Is this supposed to happen?
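If it's not supposed to be there, would an X-Robots-Tag header on the file be the right fix, since you can't put a meta robots tag inside an XML sitemap? A sketch of what I mean, assuming an Apache setup with mod_headers (it would need adapting to whatever the server actually runs):

    # .htaccess: ask search engines not to index the sitemap file itself,
    # while still letting them fetch it and crawl the URLs listed in it.
    <Files "sitemap.xml">
      Header set X-Robots-Tag "noindex"
    </Files>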
Technical SEO | UnderRugSwept0
-
Can JavaScript affect Google's index/ranking?
We changed our website template about a month ago, and since then we have experienced a huge drop in rankings, especially with our home page. We kept the same URL structure on the entire website, pretty much the same content, and the same on-page SEO. We kind of knew we would have a rank drop, but not one this huge. We used to rank with the homepage at the top of the second page, and now we have lost about 20-25 positions.
What we changed is the homepage structure: it is more user-friendly and has much more organized information, and we also have a slider presenting our main services. 80% of our content on the homepage is included inside the slideshow and 3 tabs, but all these elements are JavaScript. The content is unique and SEO optimized, but when I disable JavaScript it becomes completely unavailable. Could this be the reason for the huge rank drop? I used Webmaster Tools' Fetch as Googlebot and it looks like Google reads perfectly what's inside the JavaScript slideshow, so I was not worried until I found this on SEOmoz: "Try to avoid ... using javascript ... since the search engines will ... not indexed them ... "
One more weird thing is that although we have no duplicate content and the entire website has been cached, for a few pages (including the homepage) the picture snippet is from the old website. All main URLs are the same; we removed some old ones that we don't need anymore, so we kept all the inbound links. The 301 redirects are properly set. But still, we have a huge rank drop.
Also (not sure if this is important or not), the robots.txt file is disallowing some folders like images, modules, templates... (Joomla components). We still have some HTML errors and warnings, but far fewer than we had with the old website. Any advice would be much appreciated, thank you!
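For reference, the blocked Joomla folders look roughly like this (exact paths are an assumption; check against the real file). Disallowing /templates/ and /modules/ can keep Googlebot from fetching the CSS and JavaScript needed to render the slider and tabs, which is worth ruling out even when the text itself shows up in a fetch:

    # Typical default Joomla-style robots.txt rules (paths assumed)
    User-agent: *
    Disallow: /images/
    Disallow: /modules/
    Disallow: /templates/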
Technical SEO | echo10
-
Domain Transfer Process / Bulk 301's Using IIS
Hi guys - I am getting ready to do a complete domain transfer from one domain to another completely different domain for a client due to a branding/name change. Two things - first, I wanted to lay out a summary of my process and see if everyone agrees that it's a good approach, and second, my client is using IIS, so I wanted to see if anyone out there knows a bulk tool that can be used to implement 301's on the hundreds of pages that the site contains. I have found the process to redirect each individual page, but over hundreds it's a daunting task to look at. The nice thing about the domain transfer is that it is going to be a literal 1:1 transfer, with the only things changing being the logo and the name mentions. Everything else is going to stay exactly the same, for the most part. I will use dummy domain names in the explanation to keep things easy to follow: www.old-domain.com and www.new-domain.com. The client's existing home page has a 5/10 GPR, so of course, transferring Mojo is very important. The process:
1. Clean up existing site 404's, duplicate tags and titles, etc. (good time to clean house).
2. Create an identical domain structure tree, changing all URLs (for instance) from www.old-domain.com/freestuff to www.new-domain.com/freestuff.
3. Push several pages to a dev environment to test (dev.new-domain.com). Also, replace all instances of the old brand name (images and text) with the new brand name.
4. Set up 301 redirects (here is where my IIS question comes in below). Each page will be set up to redirect to the new permanent destination with a 301. TEST a few.
5. Choose the lowest-traffic time of the week (from analytics data) to make the transfer ALL AT ONCE, including pushing the new content live to the server for www.new-domain.com and implementing the 301's. As opposed to moving over parts of the site in chunks, moving the site over in one swoop avoids potential duplicate content issues, since the content on the new domain is essentially exactly the same as on the old domain. Of course, all of the steps so far would apply to the existing subdomains as well, e.g. video.new-domain.com.
6. Check for errors and problems with resolution issues. Check again. Check again.
7. Write to as many link partners as possible, inform them of the new domain, and ask for existing links to be switched and future links to point to the new domain. Even though 301's will redirect link juice, an actual link to the new domain page without the redirect is preferred.
8. Track rank of targeted keywords, overall domain importance, and GPR over time to ensure that you re-establish your Mojo quickly.
That's it! Ok, so everyone, please give me your feedback on that process!! Secondly, as you can see in the middle of that process, the "implement 301's" step seems easier said than done, especially when you are redirecting each page individually (it would take days). So, the question here is: does anyone know of a way to implement bulk 301's for each individual page using IIS? From what I understand, in an Apache environment .htaccess can be used, but I really have not been able to find any info on how to do this in bulk using IIS. Any help here would be GREATLY APPRECIATED!!
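For context, the URL Rewrite module for IIS 7+ supports rewrite maps, which let a single rule drive all of the redirects from one lookup table instead of creating them page by page. A sketch of the web.config fragment (assumes URL Rewrite 2.x is installed; the /freestuff path and dummy domains are the ones above, and /another-old-page is just a placeholder):

    <!-- web.config: one rule plus a rewrite map handles the 1:1 redirects.
         The map entries can be generated from a spreadsheet of old paths. -->
    <system.webServer>
      <rewrite>
        <rewriteMaps>
          <rewriteMap name="Redirects301">
            <add key="/freestuff" value="http://www.new-domain.com/freestuff" />
            <add key="/another-old-page" value="http://www.new-domain.com/another-old-page" />
          </rewriteMap>
        </rewriteMaps>
        <rules>
          <rule name="Bulk 301s" stopProcessing="true">
            <match url=".*" />
            <conditions>
              <add input="{Redirects301:{REQUEST_URI}}" pattern="(.+)" />
            </conditions>
            <action type="Redirect" url="{C:1}" redirectType="Permanent" />
          </rule>
        </rules>
      </rewrite>
    </system.webServer>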
Technical SEO | Bandicoot0
-
What should be noindexed on a WordPress blog?
I know this can be an "it depends" answer so I'll try to explain; qualifications on your answers would be great. I use the WordPress architecture for myself and clients on sites and blogs. Almost every business site we create has a blog, and I'm always working to improve results on them. My strategy has been the following:
Categories: general, main content types, general keywords. Set to index, follow.
Tags: very specific, post-specific, may only be used once for one post.
My categories have descriptions that are displayed on the category pages with excerpts. Tags rarely have a description but are displayed with excerpts on the page. My idea has been to index the categories to crawl the content, and they have unique content thanks to the category description. Tags shouldn't be archived because they may be all over the place and may have only one post with no tag description. I'm trying to reduce duplicate content but I don't want to limit results for my clients and myself. Should I set tags to noindex, follow, or should I have them indexed? The only thing I'm thinking with having the tags indexed is that I may be able to get additional traffic through the more specific tags (i.e. tag = meta tags, category = SEO).
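For context, at the template level the noindex-tags option would look roughly like this in a theme's functions.php (a sketch assuming no SEO plugin is handling archive settings; Yoast and similar plugins expose the same choice as a setting, which is usually the simpler route):

    <?php
    // functions.php sketch: mark tag archives noindex,follow so their links
    // are still crawled, while category archives stay indexable.
    add_action( 'wp_head', function () {
        if ( is_tag() ) {
            echo '<meta name="robots" content="noindex,follow" />' . "\n";
        }
    } );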
Technical SEO | JaredDetroit0