Good day doctorSIM!
Actually there was a really great post up on Moz last month about this very thing- http://moz.com/blog/hreflang-behaviour-insights
Enjoy!
Greetings Samantha,
Combining all your sites into one has several advantages, for example-
Having a main group site with individual locality pages within it is definitely doable from a local ranking perspective. We have a franchise client who has over 50 brick and mortar locations in 3 states and we are able to rank them locally with the same amount of effort you would put into a separate site (from an SEO perspective). You'll want to make sure, at a minimum, you do the following-
As for a recommendation, I would say do what fits best for you. It seems from what you're saying there are some financial benefits to merging and there are no SEO hurdles to prevent you from doing so. Good luck!
If you are using rel canonical then you can have the same on each page and it should be okay.
Otherwise, I would make sure your paginated pages don't have it. The next/prev helps Google to understand these are subsequent pages of the original category, but it doesn't really give instruction as to the preferred page (like the canonical would), so you could end up with Google ignoring the content after it sees it too many times.
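To illustrate, here's a minimal sketch of what page 2 of a category might carry in its <head> (example.com and the ?page= parameter are hypothetical; substitute your own URL structure)-

<link rel="prev" href="http://example.com/widgets?page=1">
<link rel="next" href="http://example.com/widgets?page=3">

Those two tags tell Google where the page sits in the series without declaring a preferred page the way a canonical would.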
Yes, it doesn't surprise me that you'd be having problems on those deeper category pages. Testing is always the way to go when in doubt. Out of curiosity, what e-commerce system are you using?
I personally prefer the slash, but it doesn't make any difference as long as you're consistent. And since, as you say, Google is already indexing most pages without it, I'd probably go that way too!
Greetings Oren,
Good question. We use this tactic a lot with clients and on our own website, but we don't necessarily include it on every page or post, in an attempt to reduce blindness to the form. We do it quite often though. The key is to make sure you don't just throw the form in there but actually call the reader to action in some way. ie: "For more information on (what the post topic is about), sign up for our monthly newsletter below:"
It definitely improves conversion rate (although I don't have any specific numbers for you), and with the right hook in the call to action it is very effective for lead generation, and also for classifying your leads into buckets of interest for different types of email campaigns (or whatever).
Greetings alrockn!
You do have quite the dilemma here. I actually think you will have problems if you leave it all as-is; you're between a rock and a hard place!
Most e-commerce programs do a terrible job on the technical SEO front out of the box and require some degree of customization to get it all straightened out. The pagination of category pages is a very common problem. I will take your word for it that you cannot modify your template(s) but any reasonable suggestion I think is going to require some degree of template modification.
The problem you're most likely going to run into is a thin content issue on your category pages. I'm assuming all of those paginated page versions would also have the same category description (if any) and if there is nothing unique about your main page Google is likely to ignore it.
To address your question on hard coding the first page as the canonical, I think that is really the only option you have. You'll want to make sure that category page does have some level of unique content on it (ie: category description text) so it is unique enough to attract Google's attention.
Could you not do some conditional coding to check the page version and modify the canonical accordingly?
Good day MozAddict!
SEO for Magento is near and dear to my heart. From a technical SEO perspective, I would recommend cleaning up the items you mentioned, as they can cause issues. The biggest concern is trust flow and having trust split between two versions of the page (ie: the slash and no-slash versions).
Both the 301 and the canonical tag will pass the same amount of trust. So which do you go with? I think both are fine; however, I prefer the 301 myself for dealing with the trailing slash issue, and here's why.
As time passes, believe it or not, people will link to some of your pages naturally. Because a canonical or 301 doesn't pass the full trust earned from a link, I'd rather someone link to the correct version in the first place. If I'm using the canonical tag, they may indeed link to the non-preferred version and I lose some of that trust, whereas if I'm using the 301, they will automatically be shown the correct, preferred version and I earn all the trust from that natural link.
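If you go the 301 route on Apache, a minimal .htaccess sketch for stripping the trailing slash might look like this (assuming mod_rewrite is available; flip the rule around if you prefer the slash version)-

RewriteEngine On
# Don't redirect real directories
RewriteCond %{REQUEST_FILENAME} !-d
# 301 any URL ending in a slash to the no-slash version
RewriteRule ^(.*)/$ /$1 [R=301,L]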
Moz has a great article on canonicalization if you want to read more on it.
Hope this answer is useful to you!
It should work with posts, pages, custom post types, etc. without needing that plugin.
Good luck!
I don't host my WordPress sites on IIS, but according to the WordPress documentation, you have three choices for custom permalinks on IIS-
Microsoft IIS 7+ web server with the URL Rewrite 1.1+ module and PHP 5 running as FastCGI
Microsoft IIS 6+ using ISAPI_Rewrite (free for single-site server, $$ for multi-site server)
Microsoft IIS 6+ using Ionic ISAPI Rewriting Filter (IIRF) (free for single-site or multi-site server)
I got the above from this page.
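For the first option, the web.config rule WordPress suggests for IIS 7+ with URL Rewrite looks roughly like this (a sketch from memory; check the current WordPress documentation for the exact version)-

<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="WordPress" patternSyntax="Wildcard">
          <match url="*" />
          <!-- Send anything that isn't a real file or directory to index.php -->
          <conditions>
            <add input="{REQUEST_FILENAME}" matchType="IsFile" negate="true" />
            <add input="{REQUEST_FILENAME}" matchType="IsDirectory" negate="true" />
          </conditions>
          <action type="Rewrite" url="index.php" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>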
I checked Moz's Open Site Explorer and Ahrefs which are both good sources of backlink data.
Structured data is a nice thing to have but wouldn't necessarily hurt your ranking. It can help Google more easily make sense of your content and also help you stand out a bit more in the SERPs if Google chooses to show a rich snippet result for you.
Getting rid of bad backlinks can be a manual task of reaching out and contacting webmasters and there are some tools that can make that process a little less time consuming. However, I didn't review your links for quality, just noting there were a lot of links from a small number of domains. The top one looked like the personal site of the company owner or broker.
Just a quick note, adding it to robots.txt instructs the crawlers not to crawl the URL, but it can still be indexed if it is being linked to from other places (and probably is if Moz crawler is finding it).
The easiest way to solve the problem, if you don't want them indexed, is to edit your search results page template and add a meta noindex to the <head>.
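The tag itself is just one line in the template's <head>-

<meta name="robots" content="noindex,follow">

The follow portion lets link equity keep flowing through the page even though the page itself stays out of the index. One caveat: if the URL is also blocked in robots.txt, crawlers can never see the noindex, so let the pages be crawled until they drop out.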
I personally don't like to see search results surfaced to search engines (unless there's a strategic reason for doing so).
Greetings,
I don't know that it is related, but recently we were claiming listings for a client with 50+ brick and mortar locations, and after a month or two of having them claimed, a handful of them went back into pending. I believe it is likely a bug or something. The easiest thing is to just re-verify them; however, if there is an issue with calls or sending a postcard, you'll have to call their support line...and wait a very long time (relatively speaking for the web).
Greetings James! Welcome to the fun-filled and often onerous world of SEO.
Just taking a quick look at your site, you seem to have a lot of content pages, etc. but very little in the way of trust flowing to your site. For example, I can see you have thousands of backlinks but they're only spread out over forty or so unique domains.
As real estate is often local, you'd do well to try and rank in the local pack results for real estate related searches. However, I notice a Google+ page that looks like yours, but it doesn't have the same address as your website. You want those to match up, and then also start building credibility for your Google+ page through positive reviews, etc. from users.
Real estate is a super-competitive niche and your best bet is to (at least until you have more trust) target more of the long tail of search. Those are just a few tips to get you started but anytime you're doing a competitive niche in a big city / region, it's not going to be a quick and easy task. Keep at it though; you'll get there!
Greetings! I think you're going to be better off creating an evergreen page that you can build trust to over time.
Because blog posts are often time-stamped, Google may tend to ignore one as it gets stale. The exception would be if you were to, as Jimmy is suggesting, create a category page or something similar with unique content AND an aggregation of recent blog posts in that category. But you'd need both to keep it ranked over time.
Hello!
Is it just that you have both www and non-www versions of all pages resolving? If so, you can add one 301 redirect rule in IIS to redirect all of them from one to the other and solve the problem. If not, feel free to provide more detail and I or someone else can chime in.
EDIT
I just took a quick look and it looks like that's part of the problem. Follow the above and it should take care of it. I also noted the non-SSL version is 302 redirecting to the SSL version. That is an incorrect implementation. You want that to be a 301 so if someone links to the non-SSL version you get credit for that link juice.
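For reference, a web.config sketch of that one rule using the URL Rewrite module (example.com is a placeholder; swap in your domain and preferred host/scheme)-

<rule name="Canonical host" stopProcessing="true">
  <match url="(.*)" />
  <conditions>
    <!-- Catch requests to the non-www host -->
    <add input="{HTTP_HOST}" pattern="^example\.com$" />
  </conditions>
  <!-- redirectType="Permanent" makes it a 301 rather than a 302 -->
  <action type="Redirect" url="https://www.example.com/{R:1}" redirectType="Permanent" />
</rule>

That rule goes inside <system.webServer><rewrite><rules> in your web.config.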
Cheers!
I think it is possible to add 1000s of pages of unique content over time.
Excluding (I assume that's what you mean) the layered nav pages in robots.txt won't necessarily keep Google from indexing those pages, only crawling them. So you end up with something like this in the SERP - http://screencast.com/t/SPKDV09SM9 (Note the meta description.)
If you didn't want them in the SERP, you'd need to add a meta noindex to the <head> of the page as well.
I don't really like splash/doorway/gateway pages like those from a usability perspective. If I'm going to drive traffic to a page on my client's e-commerce site, I want that page to have active product on it. I realize you can do some Magento hoodoo and get product on static pages but it's not worth the effort in my opinion. You're better off focusing on conversions from your layered nav pages with unique content.
I like #5 for the reasons you've stated. Also keywords in the URI string aren't as strong a ranking factor (in my opinion) as they used to be. My 2 cents.
Greetings Pamela!
This is nothing to worry about at all. UTF-8 is simply a type of character encoding, and the declaration is set in a website's code to tell web browsers how to interpret the characters on the page. See- http://screencast.com/t/s4I2RNsgqUh
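In the page source, it's typically just a single declaration inside the <head>, e.g.-

<meta charset="UTF-8">

or, in older markup-

<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">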
As it's not negative and perfectly normal, there's no need to change it at all.
Hello! Well this is one of the last messages you want to receive in regards to your website.
Google indicated they have applied a manual action to your site (Google Penguin is an algorithmic action, not manual). Within the email they gave you a basic set of marching orders on what you're going to have to do, although they don't make it seem as onerous as it really is. We have had quite a number of clients come to us with link-related problems and I will tell you it is a pain.
Your best bet for removing the bad links is to use an automated tool to help you identify the worst links. Some tools also include a way to gather contact info and keep track of link statuses for you, etc. which is convenient, particularly when you're looking at the number of links your site has.
To make sure you get a full list of links, consider downloading lists from several backlink providers (Open Site Explorer is one), de-dupe your list and use that as your master list. Any links you can't get removed you will want to add to a disavow list and upload to GWT (although my opinion is this doesn't really do anything to benefit you other than show Google you're trying).
When you file your reconsideration request, unless you've been extremely thorough you can expect them to reject it outright. I repeat, you have to get that link profile cleaned up!
Google has indicated in some instances, it may be better to start over with a different domain (not necessarily my opinion in your case, just making you aware).
Hello! Great question; I've worked with Magento quite a bit and layered navigation can definitely be a bit tricky from an SEO perspective.
The most important thing in opening up some or all of these pages to Google for consideration is having unique content on the page. This is not something Magento themes typically include by default, but it will be 100% necessary to get most if not all of these pages to rank well.
Most typically we add a static block to the top or bottom of the page to add uniqueness. Without it, the content on these types of pages is almost always 100% non-unique, and from my experience Google ignores it.
If your layered nav pages also use pagination, you'll need to make sure you follow best practices for using canonical and/or rel prev/next/view all to make sure Google isn't looking at your paginated pages as extra duplicated content.
And as a final reply, I think it can be a great idea with limited consequences, as long as you follow the above advice regarding unique content and pagination. Additionally, there's no worry regarding PageRank or destroying the world.
Good day! Have you read over Google's guide to Quality Score? It is pretty useful in understanding how they're calculating it and where you may be falling short-
https://support.google.com/adwords/answer/2454010?hl=en
Scroll down to "How we calculate Quality Score". Good luck!
I believe it should be-
RewriteRule ^category/latest-news/(.*)$ http://yourdomain.com/latest-news/$1 [R=301,NC,L]
Try that and see if it doesn't fix it for you. (Replace 'yourdomain.com' with your real domain of course.)
I can confirm what you've said and mentioned in your comment.
My first thought is perhaps Google is choosing not to index it because it finds it too similar to these other sites-
Those sites are ranking for the phrase "ohio virtual academy" but not the one that is the subject of this question (at least on the first few pages).
Additionally, you don't have many quality links to the domain which is another signal Google may be using in its decision.
Is your .htaccess file writeable by the server? If not your permalink (and plugin) settings won't work.
Good day! That can be a really frustrating experience, particularly when you've worked hard to write your titles. Alan Bleiweiss gave a good answer on here in the past and it's still relevant - http://moz.com/community/q/why-is-google-changing-my-title-tags
Hello!
I wouldn't consider it to be very serious; however, if you wanted to nip it in the bud, the best way would be to add a meta noindex to all of your /Account pages and then, once they've dropped out of the index, block /Account/ in your robots.txt.
The robots.txt directive asks compliant bots not to crawl the page(s), while the noindex asks search engines not to index those pages. (Note that if a page is blocked from crawling, bots can never see its noindex tag, which is why the order matters.)
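The robots.txt portion would be just-

User-agent: *
Disallow: /Account/

(The directive matches everything whose path starts with /Account/.)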
I don't know that that would be the cause of your rankings drop, but it isn't helping you, in my opinion. You could try removing it. Have you fully investigated whether the rank loss could be related to Panda or Penguin updates?
The biggest concern in my mind would be possibly having duplicate content issues and a non-desired version of the page(s) indexed versus what you want.
I think if you have a lot of links to the non-SSL version and are using your SSL version as preferred (or vice versa), you're leaving a lot of link equity / juice on the table, and you may see a nice jump in overall trust/ranking once you fix the issue and correctly 301 the non-preferred version to the preferred one.
Hello! This shouldn't be a problem. Assuming you are using the #anchor as an in-page link (ie: jump to certain content/section), Google mostly ignores the in-page #anchor, except in some limited cases where you can get a jump-to link in the search result. If you're manipulating your content using the anchor or some other mechanism, then my answer might change.
Good day!
I don't think adding the canonical to your hyperlinks is going to accomplish what you want. All of the direction Google gives is to add it as a <link> in the <head> of your page ( https://support.google.com/webmasters/answer/139066?hl=en & http://moz.com/blog/rel-confused-answers-to-your-rel-canonical-questions ).
From a technical web development perspective, when a rel attribute is present on a hyperlink, it "...describes the relationship from the current document to the anchor specified by the href attribute..." ( http://www.w3.org/TR/html401/struct/links.html#adef-rel ). That being the case, rel="canonical" on a hyperlink would only make sense where the page the link appears on is actually pointing to its own canonical version, which isn't what you're after.
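To make the distinction concrete (example.com is a placeholder), the supported form is a <link> in the <head> of the non-preferred page-

<link rel="canonical" href="http://example.com/preferred-page/">

whereas rel="canonical" on an <a> tag in the body isn't a usage Google documents support for.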
Hope that helps!
Yes you are 100% right in my opinion. The whole reason I believe for the shift in brand preference is a play on generating more ad spend via Adwords. The search results are so unhelpful for so many searches nowadays.
For example, search for "Trademark Registration" on Google. That SERP used to be full of attorneys and websites offering that service. Now it's mostly government websites forcing the attorneys to buy ads for this highly searched term.
You're most welcome, glad you found it useful!
If the content is irrelevant, I wouldn't redirect it to specific pages on your site. If it is generally still relevant to the business as a whole, you could 301 redirect it to the root URL. If not, and some referring visitors might become customers, you could at the very least change your 404 page to offer up calls to action or shopping options for the folks who see it.
My 2 cents.
I wouldn't worry though about removing the links unless they're very poor quality and their percentage is high compared to your "good" links.
If you're wanting to check to make sure it exists on a page as a sort of audit, the Screaming Frog SEO Spider has custom filters you can use to check for code on a page and then sort by those that are missing it.
I would suggest removing it as that page sends several signals that could be interpreted as negative by Google, et al-
None of the above is probably a deciding factor on its own, but combine them all and it paints a picture that isn't positive for your client.
I'll give you my opinion but it isn't a definitive answer.
I believe what you're seeing is a Google SERP test and they are pulling that from the content on the manufacturer's page but I agree there is no rich markup on the page like Schema to point Google to use it.
A quick search of "lacrosse monkey" shows me no Google ads on the page, which is a pretty sure signal Google recognizes this manufacturer as a brand entity. That in and of itself would suggest Google might also understand the products as entities and be pulling content from the product page and testing it in the Knowledge Graph to see how searchers interact with it.
Unfortunately I don't think there's much you can do about it.
The file name isn't as important for SEO as it used to be. It can provide a useful signal as to the nature of the image's content and the text content around it. It shouldn't cause any "issues" or penalties with your SEO, however, you may lose the side benefit of some/all traffic from image searches.
The biggest concern I would have is if the manufacturer decided to move, delete or rename images.
Hi Nick,
Unfortunately at this time there isn't a way to specify a shorter meta description. The only way to achieve what you're looking for would be to shorten all meta descriptions for sub-pages but that means you'd lose a lot of real estate if it showed up in a normal SERP.
Darn!
Another alternative would be to use Screaming Frog to get a full list of URLs from each site, then use a scraping tool like Mozenda to scrape that list from each site, pull the content area and it will create the data structure you want and make it available for export. Then you can basically do what I had said in the previous email, compare the two spreadsheets.
Screaming Frog SEO Spider could do that for you. You'd need to set up a custom filter to look for a copy identifier (ie: a div that always contains the main copy) and have it scrape that for you while it's crawling. Do the same for the other site and then you could match them up pretty easily, I think.
Here is a good resource on different ways of using the tool - http://www.seerinteractive.com/blog/screaming-frog-guide We use it almost daily for a variety of tasks and find it to be pretty flexible. Good luck!
Your best way of handling that is to canonicalize all pages to a "View All" page for the search engine and use rel="prev" and rel="next" for your "Next" and "Previous" links respectively.
That's the best way to handle pagination if you're going to use canonicals. To answer your question directly though, the paginated pages should not canonicalize to themselves.
Another solution would be to noindex,follow pages 2 - n.
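As a sketch of the View All approach, the <head> of page 2 might contain (URLs hypothetical)-

<link rel="canonical" href="http://example.com/widgets/view-all">
<link rel="prev" href="http://example.com/widgets?page=1">
<link rel="next" href="http://example.com/widgets?page=3">

With the noindex,follow alternative, pages 2 - n would instead carry a meta robots noindex,follow tag.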
Hope that helps!
It may be difficult, particularly if the .com is also a well established brand in the UK. That being said though, it isn't impossible to have your client rank in the SERPs as well for the same brand if you can establish them as a brand entity with Google.
For example, here is a search I did for Peak Auto, a well known, established brand in the automotive supplies industry- http://screencast.com/t/ycYugMK876 You'll see there are two other local results in the mix of primarily Peak Auto results.
Another example is to search for "Merck" in the .co.uk Google. Some results will be from the US-based company of that name, the other results will be the German company of the same name. And interestingly both are drug manufacturers.
So it may not be an easy task, but it certainly isn't impossible.
As Dana said, this is a pretty typical issue in e-commerce, particularly for merchants with large numbers of SKUs that are pretty much the same thing with very slight variations. Think about trying to sell something like light bulbs.
In the recent past we have basically reinterpreted an old-school, spammy SEO technique that actually has a pretty useful, legitimate purpose in this scenario: spinning text. You can use the variables from your product data to create "unique-ish" content that will likely pass a duplicate content sniff test.
For example, there are many ways to write your call to action and it could be written like the below and run through a content spinner to produce different variations--
{{Call|Call us|Call Today|Telephone us|Ring us}} for {{the best|your|our best|daily best}} price 615-406-3255
Disclaimer: I'm certainly not endorsing this as a way to fool the search engines, but sometimes trying to get enough unique content to interest the algorithms takes a little creative effort.
If it were me, I would go with the direct line or extension, make sure all citations were formatted exactly the same, and reserve the main, primary number for the business entity.
It depends on whether this is a single-location brick and mortar or one of many branches. If a single location, go for the home page. If multiple locations, link to the profile page of that location on the main website. Same for the individual person entity; it should go to their personal profile page on the website.
So those are some thoughts of mine.
Ah I love Raven. But which part in particular? Is it being pulled in from your Analytics? Can you share a screenshot of what you're looking at?
Link reclamation is a pretty common practice in SEO. I believe generally, as long as you are following best practices (ie: avoid Penguin issues) and not being overly aggressive you will be okay trying to reclaim broken and/or lost links. Links appear and disappear to sites all the time as the Internet ebbs and flows. Consider a blog home page. As new blog entries with external links cycle through the page, links come and go on a regular basis.
Just make sure the sites you are reclaiming links on don't have their own red flags in their link neighborhoods.
A quick question: Where did you discover this page? Using a tool like Xenu or Screaming Frog or in Google Webmaster Tools? Other? That'll help narrow down the issue.
Oftentimes WordPress or WordPress plugins will spawn various URI iterations (especially with URI variables) that you may not be aware of.
If you're looking to redirect ALL sub-pages to the new main page, then you would use a RedirectMatch rule (from Apache's mod_alias) in your .htaccess file, something like-
RedirectMatch 301 ^/blog/dental-tips(.*)$ http://floridadentist.com/dental-tips/
If you're trying to redirect to the matching URI just under that new sub-directory this would be more appropriate-
RedirectMatch 301 ^/blog/dental-tips(.*)$ http://floridadentist.com/dental-tips/$1
The name of the post type only matters for your menu items in admin, but yes, it is an issue sometimes. You could call it "News Item" and "News Items", or just go grammatically incorrect with "Media Coverage" and "Media Coverages".