In my limited experience, 301s will redirect +1s, as Arjen said.
For example, this page http://moz.com/blog/how-to-rank gained most of its +1s before a 301 redirect, and those +1s are now associated with the page.
Typically, if I understand correctly, the number of +1s on your Google+ profile closely reflects the number of +1s associated with the homepage of your website.
What's interesting is that Google reports 16 +1s to your homepage through their API, which we can see through Open Site Explorer: http://www.opensiteexplorer.org/pages.html?site=www.ibremarketing.com%2F
You have everything set up correctly as far as I can tell. Not sure why there is a difference. It could be because of caching or differences in the way the widget reports +1s from the web app. Regardless, the difference isn't enough to cause concern.
As far as helping your rankings, to be honest the effect is likely small at this point. We know Google uses G+ to index new content, and it can play a significant factor in search personalization. At this point I would work on promoting yourself for more visibility, and eventually you may see the influence of Google+ shares.
Hope this helps! Best of luck.
Hi Hal,
Great questions - thanks for writing in.
In addition to optimizing your pages for specific keywords, there's one more thing you want to consider: what's the best title you can write to get both clicks and conversions?
If you have a brand new site, it's unlikely you're going to rank very high for a competitive term, but you want to get the most out of every impression that the search engines give you. This is even more important on the homepage as it will likely get the most impressions.
If goat milk soap is your flagship product, it might be wise to work it into your homepage, although it doesn't necessarily have to be front and center.
Let's take your existing title....
**Truly Natural Moisturizers & Skin Care Products | Alabu Skin Care**
1. The first thing I notice is "Skin Care" twice, which is probably unnecessary.
2. I like the "Truly Natural" - it's something I might click, although I think there's room for improvement.
3. Let's see if we can add Goat Milk Soap.
So some possibilities:
"Obsessively Natural Skin Care Products & Goat Milk Soap | Alabu"
The problem with this one is it moves the target keywords too far toward the end, where they carry less weight.
"Alabu Goat Milk Soap & Skin Care Products - Truly Natural"
Now it feels like we're getting somewhere. I'm sure you can improve it way beyond this, but you get where I'm going.
As for your second question - which page to optimize for Goat Milk Soap - I think you're right on the money wanting to optimize your category page for the broad term and get more specific with the product pages. Keep in mind Google may or may not follow your wishes. You may get a stubborn product page that ranks for what you intended your category page to rank for, or vice versa. You'll have to experiment. Inbound links and anchor text play a large role in the final determination.
Hope this helps! Best of luck with your SEO!
Hi Catfish,
Technically, www.poker-coaching.net and www.poker-coaching.net/ are 2 separate URLs - one with the trailing slash, the other without. To be perfectly compliant, you would want to make sure that one redirects to the other. You can use a tool like this: http://www.seoconsultants.com/tools/check-server-headers-tool/ to check. For example, see how the homepage of SEOmoz 301 redirects one version to the other.
That said, Google has said this isn't that big of a deal. So fixing it would make you technically correct, but it probably won't make that big of a difference.
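If you do decide to enforce one version, a rewrite rule is the usual route. Here's a minimal sketch, assuming an Apache server with mod_rewrite enabled (adapt the rule if you prefer the no-slash version):

```apache
# 301 redirect URLs missing a trailing slash to the slashed version.
# The file check keeps real files like /style.css from being rewritten.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*[^/])$ /$1/ [R=301,L]
```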
On the penalty... I see a few potential problems.
1. Huge number of links on each page. The drop-down navigation is huge. I've seen Google crack down on sites like this with lots and lots of nav links above the fold. I'd cut those in half and see what happens.
2. Meta refresh. You have a lot of affiliate links that redirect via meta refresh. I see that you have blocked these pages with robots.txt, but you might want to add rel="nofollow" to those links as well (see the sketch after this list).
3. A lot of your backlinks are from poker related sites (which is good, in a way) but it is possible that they were hit by penalties as well. If the folks linking to you were devalued, you suffer as well. I'm not saying this is definitely the case, but building a more robust link profile never hurt.
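On point 2, the markup change is tiny. A hypothetical example (the /go/ path and anchor text are made up; it assumes the directory is already blocked in robots.txt):

```html
<!-- nofollow tells engines not to pass value through the affiliate link -->
<a href="/go/poker-room-offer" rel="nofollow">Visit the poker room</a>
```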
Hope this helps! Best of luck.
Hi Vincent,
Great question! The reason you are getting those duplicate content errors is the HTML similarity of your product pages. Even though they have different title tags and content, there isn't enough difference in content from one page to another to fully distinguish them.
That said, Google is a bit more sophisticated at determining duplicate content, but in the age of Panda, they might view the content on these pages as "thin." In general, it's nice to have at least 250+ words of rich, unique content on every page you are trying to rank.
http://www.seomoz.org/blog/duplicate-content-in-a-post-panda-world
In general, you don't want to canonicalize those product pages, as this would effectively drop them from the index. Instead, the ideal solution would be to add more content to those pages, above the fold, and beef up the descriptions. Even if that's unreasonable, I would still prefer the duplicate content errors in this scenario to canonicalizing everything to 10 pages.
Hope this helps! Best of luck with your SEO!
With meta descriptions, it's often dealer's choice. If you don't provide a meta description, Google's search snippet will pull relevant text from your page that matches the user's query, which sometimes may help click-through rate.
On the other hand, a well crafted meta description can also help attract the right kind of clicks.
In this case, however, I fear it was just plain oversight. We update this page a lot, and I believe the meta description simply got lost in one of the many development cycles. So thanks for the heads up!
Hi Hal,
Hope you don't mind me taking a stab at this question also.
Blocks of duplicate content, whether images or text, repeated across several pages are often known as "boilerplate" material.
Boilerplate material is okay, but if you have too much of it, without enough unique content to balance it out - that's when you get in trouble.
In general, you want to make sure you have a minimum amount of unique content on each page. My own personal rule of thumb is at least a couple hundred words - but it varies from situation to situation. If you have a lot of boilerplate, you want to make your original content front and center. This means placing it near the top of the body copy, above the fold and first in the HTML. Then place the necessary boilerplate info after this.
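As a rough illustration of that ordering, here's a hypothetical product-page skeleton (the class names and copy are made up):

```html
<!-- Unique, page-specific copy comes first in the HTML -->
<div class="product">
  <h1>Goat Milk Soap - Lavender</h1>
  <p>A couple hundred words of unique description for this product...</p>
</div>
<!-- Shared shipping/returns boilerplate follows the unique content -->
<div class="boilerplate">
  <p>Shipping, returns, and company info repeated on every product page...</p>
</div>
```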
It's best to try to generate as much unique content for each product as possible. You can do this by including reviews, how-to videos, testimonials, or whatever value-added content you can dream up, as long as it's quality content and serves your customer.
Finally, when "optimizing" the page, make sure to focus elements like the title tag and meta description on the unique content of your page, and not the boilerplate material, so as not to dilute your rankings.
Best,
Cyrus
Oftentimes folks overlook the drop-down links in the navigation. These count as links, and it's easy to go overboard on them, so they're important not to ignore. On your site I counted over 100.
But to answer your question: if I understand correctly, you're asking whether SEOmoz counts links pointing to on-page anchors as links. I believe the answer is yes. In most cases, there are few enough anchors that it doesn't present an issue. But in your case it might inflate your count and make it seem like there are more out-pointing links than there actually are.
The big question is: does this influence how Google or other search engines treat the issue? I'm not sure anyone knows the answer to that. I suspect Google is more than sophisticated enough to note the anchor (they use these when generating sitelinks in SERPs, determining semantic relationships, and for other reasons) but doesn't count it against your crawl allowance.
So, to summarize, if it's only on-page anchors causing you to have an inflated link count, then you probably don't have much to worry about. But if you have other, real links causing this, you may want to trim down your top-heavy navigation links.
Howdy,
Hmmmm... I have to wonder why you changed the domain name in the first place. There are certainly legitimate reasons to do so, but a domain migration is a pretty drastic measure, and most webmasters want to avoid one unless they have to.
If the domain is close to a popular domain, you may continually run into the problem of Google trying to correct your spelling when folks search for your brand name - especially if the brand doesn't have a lot of visibility.
For example, if I registered the name mfacebook.com, Google would likely try to steer me towards Facebook time and time again (not to mention Facebook hitting me with trademark issues).
Will the domain name hurt your SEO? Probably no more than the normal risks involved with migrating a website to a new domain.
The bigger question that I think you want to address is: will Google recognize the keyword in the domain? Although Google does a pretty good job at dissecting domain names (for example, they know that bluewidgets.com is "blue widgets"), the algorithm isn't perfect. SEOmoz.org is a perfect example. For years Google didn't give us credit for the keyword "SEO" because "moz" wasn't a common word. So instead, we were clumped together in the "SEOmoz" (single word) category.
My best advice, if you want to take advantage of keywords in the URL for an existing domain, is to use them sparingly in subdirectories and file names, such as example.com/keyword.
Regardless, keep us up to date on your progress. Best of luck with your SEO!
Hi Micheal,
First of all, congratulations on your endeavor. Looks like you've done a lot of hard work and you should be congratulated for your efforts.
Finding a good SEO firm can be both challenging and rewarding. The range of talent out there runs from superb to snake oil salesman. Whoever you hire, be sure to do some research. Ask for a list of clients and recommendations (sometimes, for practical reasons, they can't provide this). Talk to folks who have used the service. Ask about certifications, training, philosophy and influences. And most importantly, ask about the techniques they will use.
A good place to start is the SEOmoz Recommended list, although most of these folks might be out of your price range.
A lot of good SEOs hang out here in the Q&A, and if you check their reputation you can often find a good "up-and-comer" that doesn't charge an arm and a leg for SEO services.
First and foremost, I'd look for a firm experienced in link building, especially in your situation. Wish I could suggest someone specific, but you fall in that tight area between doing it yourself, and hiring an outstanding firm.
Regardless, best of luck with your SEO!
Hi James,
Couple of points:
1. The term "highly competitive" might be a bit misleading. I'd pay more attention to the score of 53%, which I would consider only moderately competitive. If you look at the actual search results, the #1 ranking only has a Domain Authority of 19, so I would think this is a keyword you could shoot for.
2. The score takes into account how hard it would be to obtain a top ranking, but not necessarily #1. There are some high authority domains in the top 10 for that query, including www.swedishdigitatalmarketingagency.com, which makes the overall vertical more competitive.
3. The Keyword Difficulty tool is a best guess based on comparative metrics, but it isn't always right.
4. When in doubt, I always run a full analysis report to see why URLs are ranking the way they do.
Overall, I trust the 53 score. It seems a little high, but it's within a range I would consider reasonable. The good news is the Keyword Analysis tool is going through some changes soon as SEOmoz acquires new data sources and continues to improve its scoring algorithm. In the meantime, you can always use the MozBar - with the SERP overlay turned on - to eyeball any results you'd like to examine.
Best of luck with your SEO!
Hi Cristian,
Sounds like a bug in the software. Thanks for letting us know. Do you have any specific examples the engineers can look at?
I'm going to ping the SEOmoz help team (help@seomoz.org) and let them know. Again, thanks for the heads up.
"...by the time you told someone the address you where exhausted and the prospect was confused"
Best answer I ever heard!
Hi Iain,
Great question. Like a lot of things in SEO, the answer of whether or not to use a ccTLD is... it depends.
Pros
Using a country level domain can boost your SEO targeting in that specific area, may help with click-through rate depending on local biases, and sometimes it's easier to create a country specific "brand" with this approach.
Cons
You have to do SEO for each site. Your Italian site may get links, which doesn't help your German site. Duplicate content becomes an issue if you publish the same articles on multiple sites. Maintaining multiple sites becomes harder.
Questions to Ask
Will you publish content in multiple languages? If so, will you publish multiple versions of the same content? If so, you will want to make use of the hreflang tag (see the sketch after these questions).
Are you targeting specific countries, or multi-country regions? If it's the latter, then a single domain may be more efficient.
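For reference, here's a minimal hreflang sketch (the example domains and paths are hypothetical; each version of the page should list every alternate, including itself):

```html
<link rel="alternate" hreflang="it" href="http://example.it/articolo/" />
<link rel="alternate" hreflang="de" href="http://example.de/artikel/" />
<!-- x-default marks the fallback for unmatched locales -->
<link rel="alternate" hreflang="x-default" href="http://example.com/article/" />
```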
I'm not a true expert on international SEO, but one of the best overviews of the topic I've seen is this WBF by Rand: http://www.seomoz.org/blog/international-seo-where-to-host-and-how-to-target-whiteboard-friday You may also want to check out this post: http://www.seomoz.org/blog/international-seo-dropping-the-information-dust
Regardless, hope this gives you something to think about. Best of luck with your SEO!
That would be an awesome feature. Unfortunately, the only way to get a report like that is if you don't have any competitors chosen in your campaign setup. If you do have competitors in your campaign, there's no way to remove their link data from this report.
If you're interested, you can submit a PRO feature request. Other members vote them up and down and this is how the product team decides what to build.
Regardless, thanks for the heads up and letting us know this is a functionality you'd like to see.
Best of luck with your SEO!
Howdy,
Generally, it's a bad idea to target keywords not in your content. It feels spammy, you're not likely to rank for it, and it's likely to have a negative impact on user experience.
Imagine clicking on a search result about "Competitor X Keyword Keyword" - and then the landing page has virtually nothing to do with your search. Well, a couple of things will happen...
I think you get my gist, but let me address your specific points.
1.) Does it hurt or help to load up the page with misc. keywords?
Again, it sounds spammy, and keyword stuffing is a tactic that hasn't worked well since 2007.
Lessons Learned from an Over-Optimizer
2.) Any suggestion for almost blank/generic landing pages ?
Add useful, relevant content.
Duplicate Content in a Post-Panda World
3.) Any benefit or penalty for using fewer keywords on multiple pages ?
A general rule of thumb is optimize your page for keywords so that 80% of searchers using those keywords have the same general intent. If 70% are looking for one thing, and 30% something else, it's best to split those into 2 different pages.
It sounds like your customer should focus more on conversions instead of raw traffic. It might help to guide the client to go for quality of traffic instead of quantity, which usually pays better dividends.
Finally, here are a couple more resources that might help you understand on-page optimization. I recommend you take 30 minutes and read/watch them all!
http://www.seomoz.org/blog/getting-onpage-seo-right-in-2012-and-beyond-whiteboard-friday
http://www.seomoz.org/blog/4-graphics-to-help-illustrate-onpage-optimization
http://www.seomoz.org/blog/perfecting-onpage-optimization-for-ecommerce-websites
This should give you plenty of ammunition for your talk with your client. Best of luck!
Hi Sophie,
The On-page reports are supposed to generate automatically for any keyword ranking in the top 50 of your primary search engine. (If this isn't the case, it could be that your keywords are ranking, but not for your primary engine, or there is a glitch in the software.)
Fortunately, there's an easy way to generate a weekly on-page report for any keyword you want.
For example, your "Contact Us" page ranks #5 for the keyword phrase "Yellow Shoes" and the Web App grades it a B. What if you really want to target your homepage for this keyword phrase? Here's how:
1. Hit Report Card at the top of the On-page summary.
2. Choose your Keyword you want to grade. The keyword must already be included in your campaign. Select Manage Keywords if you need to add keywords.
3. Enter the URL of the page you would like to grade.
4. Hit Grade My On-Page Optimization to generate your report.
You can read more about it here: https://seomoz.zendesk.com/entries/20034412-on-page-reports
If you're having trouble, feel free to contact the Help Team (help@seomoz.org)
Best of luck!
Hi Branagan,
Great question! A big discrepancy between PageRank and page specific Moz metrics can indicate a penalty or algorithmic suppression of PageRank. That said, this is only one reason the difference can exist and it's best not to jump to conclusions without a deeper investigation.
The best thing to do is to examine the inbound links to the domain (using OSE or another tool of your choice) and try to find a reason why the site might be penalized.
I'd also check several pages across the site. Does the homepage also have a depressed PR? This might be a better indicator of a penalty than an interior page, where the flow of PageRank is less predictable.
So when you find these differences, it's best to use them as a starting point for further investigation, but don't rely on them exclusively.
Hope this helps! Best of luck with your SEO.
Ranking signals are extremely complex, and On-page SEO is only a fraction of the pie (although sometimes it's a very large piece!)
For a broader perspective, run your keyword through Moz's Keyword Analysis Tool - then run a "full" report. This will show you dozens of on-page and external ranking factors for the top ten websites returned for that search query.
.... and this still isn't the full story.
Questions like this help us learn how Google's search algorithm works, and why some sites rank higher than others, even when we don't expect it. Keep questioning everything!
The quickest way to do this is using the "Top Pages" report in Open Site Explorer.
Simply enter the root domain of the site you want to research, hit "Search," then navigate to the Top Pages tab. This will show you the number of linking root domains to each internal page.
Here's a screenshot:
https://skitch.com/cyrusshepard/8baj7/open-site-explorer
+1 For Doug's answer. You can download the standard Inbound Link report from OSE. Make sure to filter for "external" links to "all links on the root domain" (or subdomain)
Open your report in a spreadsheet. The "Target URL" column will show all the URLs on the domain with an external link pointing to them. Here's another screenshot:
https://skitch.com/cyrusshepard/8bakm/microsoft-excel
Hope this helps. Best of luck with your SEO!
Hi Emma,
I feel your frustration. Unfortunately, we don't know all the ways in which Google's algorithm works, although we have a pretty good idea of many pieces of the puzzle.
Even the highest correlated metric we know of, Page Authority, only has a correlation of about 0.35 (1.00 being a perfect correlation). Pretty good for SEO, but in the real world it's not the best correlation.
So the factors in the Keyword Difficulty tool are known ranking influences, but it's impossible to incorporate all 200+ ranking signals (some known, some unknown) into a single tool. Instead, the best way to use the tool is to try to find out exactly why one page ranks above another. Is it over-optimized? What is it about those social signals that helps? Are the links from relevant sources? Has the site been penalized?
Yeah, it's conflicting and confusing. In truth, the first 80% of SEO is pretty standard: create good content, make sure it's accessible to search engines, follow SEO best practices, market it smartly, get links, repeat. Do this well and you'll win most of your battles. The remaining 20% gets hard, and if we think about it too much, we sometimes waste our time.
Regardless, the best strategy, in my opinion, is not to go after 1 keyword, but 100s at a time using a long-tail strategy. I wrote about it in more detail here: http://moz.com/blog/how-to-rank If you're just starting out, or even experienced, it's the best way to go.
Hi Ahtisham,
You mentioned both Duplicate Content and Rel Canonical errors. There might be some confusion because rel=canonical is not an error, but a warning. In most cases you don't need to do anything about it.
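For what it's worth, a rel=canonical notice just means a page declares its preferred URL in the <head>, something like this (the URL is hypothetical):

```html
<link rel="canonical" href="http://example.com/blog/post-title/" />
```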
But it's good to fix the duplicate content errors. Yoast's WordPress SEO plugin is usually the tool to do this (although it doesn't work in every situation, depending on your theme).
Duplicate content in WordPress is often caused by archive pages: category, tag, date-based, and author archives.
If these are creating pages that are very similar (mostly empty or repeating pages), you may want to stop search robots from indexing them. The WordPress SEO plugin can help with that under the "Indexation" tab. You can check the boxes to disable indexation of these categories - but be careful! You don't want to block something accidentally. This is a very powerful tool, so use it wisely.
Here's a screenshot of what I'm talking about - https://skitch.com/cyrusshepard/8et13/indexation-above-the-fold-wordpress
Hope this helps. Best of luck with your SEO!
Unfortunately, Moz doesn't provide historical info for metrics like DA or PA (although it's something we'd love to do in the future).
A favorite tool of many SEOs is Searchmetrics' visibility tool. Although this only provides a rough estimate of a domain's visibility in search results over time, it has proved valuable in detecting penalties and changes in search volume, though it works best with large sites.
Majestic SEO also provides historical link history. Access is limited with a free account, but paid accounts are generally reasonable.
Hope this helps! Best of luck with your SEO.
First, a disclaimer. I'm not an expert on SORBS-SPAM, so take my advice with a grain of salt. That said, with millions and millions of IPs on the list, including IPs that host big and powerful sites, it's doubtful that being on the blacklist alone will hurt your SEO much.
The bigger problem is any email you send may get blocked. Regardless, it's a good idea to try to get yourself delisted. Apparently, this is a fairly common procedure and you can find information on it here: http://www.sorbs.net/faq/retest.shtml
Hope this helps. Best of luck with your SEO!
The value of links from all large directories like BOTW is debatable. We know that Google has devalued these links over the past few years, but some webmasters feel they still pass enough value to justify the cost.
One thing is for certain: they are no longer the "must have" links they used to be. You can group DMOZ and the Yahoo directory in this category as well. If it's within your budget, these links certainly don't hurt and, of course, they are fairly easy to acquire.
In my personal opinion (and this does not represent an official Moz stance), if you actively and successfully pursue other link-building opportunities, it may not be worth the expense of these types of directory links.
Michelle & Blake,
Tim raised some good points, so rather than address those I'll try to answer your question directly.
Yes, you could theoretically rank for these unique SKUs if you were to build some half-way decent content around them - especially if the competition is low as you say.
It's likely not as easy as putting the keywords in your meta tags (I assume you mean the meta description tag, or even the meta keywords tag, which most folks don't use anymore).
If you really want to rank for these keywords, they should probably be a natural part of your content and body copy. To Tim's point, could you target these terms in a way that makes sense to visitors? Perhaps yes. If your visitor was looking for a product similar to, or a substitute for, a particular SKU, this might make sense - but you actually have to create content around the terms, and not simply stick them on the page or hide them in the meta data. Does this make sense?
You would want to target these keywords and ideas just like any other. Tools like the On-Page Grader might give you a good idea of where to start.
Hope this helps! Best of luck with your SEO!
Looks like you're ranking really well in local results for Charlotte Electrician, around #2. Here's a screenshot: https://skitch.com/cyrusshepard/8cht3/fullscreen
I also see this when I remove localization and personalization effects, so it's safe to assume you're actually ranking pretty well for this term, even if you're not listed in the "regular" results. Google has been constantly changing the way it displays and mixes local results with "regular" ones - but I'm going to go out on a limb and say that these days, your good showing should be considered equal to a regular result, for all practical purposes.
As for your ranking for "Electrician Charlotte": you may receive an F grade, but Google is sophisticated at interpreting natural language, synonyms, and intent, and obviously understands your page deserves to rank for this term even if you don't spell it out in strictly defined SEO terms.
All in all, looks like you're doing a lot of things right. Good luck with the rankings!
Both Ryan and Marie provided good answers, but let me elaborate further.
This is not a PHP thing. You can verify this yourself by visiting these pages.
Does the page content change when you visit /?lang=en&limit=5&limitstart=20 and then another parameter combination for the same page? If these look like the same page, then you have a duplicate content/title issue on your hands.
Google's take on the matter is simple...
"Provide one version of a URL to reach a document
To prevent users from linking to one version of a URL and others linking to a different version (this could split the reputation of that content between the URLs), focus on using and referring to one URL in the structure and internal linking of your pages. If you do find that people are accessing the same content through multiple URLs, setting up a 301 redirect from non-preferred URLs to the dominant URL is a good solution for this. You may also use canonical URL or use the rel="canonical" link element if you cannot redirect."
You have several options to deal with this, depending on the content.
This is a tricky area, and you must be careful when dealing with duplicate content, because it's easy to make a mistake and have Google de-index your content. That said, by correcting these errors, you just might see an improvement in your indexation and traffic stats.
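If you go the rel="canonical" route mentioned in the quote above, a server-side sketch might look like this. It's a hypothetical PHP snippet (simply stripping the query string is an assumption; only do this where the parameters genuinely don't change the content):

```php
<?php
// Point parameterized duplicates (?lang=...&limit=...&limitstart=...)
// back at one preferred URL via rel=canonical in the <head>.
$canonical = 'http://www.example.com' . strtok($_SERVER['REQUEST_URI'], '?');
echo '<link rel="canonical" href="' . htmlspecialchars($canonical) . '" />';
```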
Hi GoogleCrush,
Sorry for the late response. I won't pretend to be an expert on this, but I'll give you my best assessment of the situation.
Generally, it's best practice on script-heavy pages to include alternate information for degraded viewing experiences. Typically this is accomplished with the <noscript> tag. You can see this effect if you turn off JavaScript (you can use the MozBar for this) and view the page. For this Walmart page, that view exactly matches Google's text cache of the page:

http://webcache.googleusercontent.com/search?q=cache:http://instoresnow.walmart.com/Kraft.aspx&hl=en&strip=1

Typically, this is fine and even considered best practice for usability. You can get into some trouble if the content of your degraded experience doesn't match your rich Flash or JavaScript experience, but it's hard to tell where the line is drawn.

As far as adding links to this area: yes, you can and should add links to the same destinations as your Flash experience.

You can include images in this area, but without scripting I'm not sure how "pin-able" they would be. Not sure I'm too clear on this question.

As for video transcripts... I'm really unsure about this point. From a usability standpoint, it might make sense to include a video transcript. That said, including huge paragraphs of text not visible to normal users could send the wrong signal. Best practice would be to include the video transcript for all users - the way SEOmoz does for Whiteboard Friday.

Hope this helps! Best of luck with your SEO!
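P.S. For reference, here's a minimal sketch of the degraded-experience markup discussed above (the headings and links are hypothetical; the key is that they mirror what the scripted version shows):

```html
<noscript>
  <h2>In-Store Recipes</h2>
  <p>Browse the same recipe collection shown in the interactive version.</p>
  <a href="/recipes/">View all recipes</a>
</noscript>
```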
Lots of excellent answers here deserving of many thumbs up, so I'll just throw in my 2 cents.
OSE generally reports anywhere from 40-90% of the links found in Google Webmaster Tools. Because of the emphasis on quality over quantity, it doesn't list every link. That said, the SEOmoz Linkscape team is working hard at delivering an index that may be 3x the size of the current index - which means a ton of links. W00t!
Check out this interview with Rand where he talks about this - http://pointblankseo.com/rand-fishkin
Regardless, it does seem odd that a site with good link metrics would have none of its outbound links counted. One possibility is that even though Linkscape knows about the high authority page through other inbound links (and can therefore assign metrics such as MozRank and MozTrust to it), the URL itself has blocked Linkscape from crawling it.
In this case, you would see data for the high authority URL just like you described, but OSE would have no record of the outbound links on the page, because it wasn't allowed to crawl.
This is only one possible explanation out of many. You could check the robots.txt or meta data of the URL in question and see if anything is blocked. It's hard to tell from this end without knowing the exact URL.
Regardless, I hope this sheds some light on the issue. Best of luck!
The good news is your homepage is ranking for a lot of words!
Unfortunately, you can't realistically optimize a page for more than a few key phrases, but often, if the content and incoming links are rich in semantic hints, your content can rank for dozens, hundreds, or even thousands of long-tail phrases outside the "top" keywords you're actually trying to target.
(Just to be clear - adding more than one <title> tag is **not** a good idea.)

One thing you might want to consider is whether your homepage is ranking for search terms that it doesn't answer very well, and whether you should be creating other resources on your site to answer those queries. If so, it might be worth your time to do so and make sure you are linking to them appropriately with good site architecture.
This is such a nuanced question that depends on so many different factors, I fear I can't give you a complete answer. But I can tell you some of the questions I'm asking in my head.
- What happened to the old para-sailing pages on Site B? Do they redirect to the sales page/homepage? Does the site still rank for and receive traffic for para-sailing? What did those old pages rank for?
- Are you trying to rank the new site for para-sailing queries? If so, it might seem appropriate to redirect the old para-sailing pages to the next most relevant landing page. This is what Google would prefer, anyway.
- Do you take up 2 SERP positions now?
That's off the top of my head. Would probably have some more questions if I saw the site and knew more about the particular queries.
Unfortunately, this might possibly be an attempt by a spammer at what is called a 302 hijacking. Google says 302 hijackings haven't worked in years, but many webmasters disagree wholeheartedly.
(on the other hand, it might not be this at all)
Basically, a spammer points a temporary redirect at your site (check the redirection through a tool like URI Valet to see if it's a 302). Often, if the spammer's site has slightly higher authority, it can fool the search engines into hijacking your traffic. A deeper explanation here.
Another possibility is that your site has been hacked. Check for spyware, check Webmaster Tools and check to make sure that your domain is not the one being redirected. In Webmaster Tools make sure there hasn't been a change of address, and do a fetch as Googlebot to make sure your pages show up correctly.
Finally, you can fight back.
If it is an actual 302 hijacking, Google should take care of it fairly quickly.
Often, these aren't 302 hijackings at all, but are caused by internal site issues. Do a thorough site audit first before taking further steps.
The only long lasting way to rank for local specific pages is to offer truly unique content on those pages, and build unique links to those pages.
The two methods you mentioned here, using near-duplicate sites and pages, may work for a short time or in non-competitive niches. It may also work somewhat if a very strong link profile is backing it up... but in general these sorts of tricks usually result in a drop in rankings. If not now, then during an upcoming algorithm change.
Oftentimes, misguided webmasters think they are doing the right thing in launching these sites and pages, with no ill intent. Unless the pages are obviously spam or doorway pages, in my opinion it's probably not worth reporting them to Google, but that decision is of course best left to each individual.
Read more about doorway pages: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66355
Consider how Yelp has hundreds of pages about dentists, at least one page for every major city in America. Although the pages are similar, they are each filled with unique content and all have unique links pointing to them. Each delivers a similar message, but provides unique value based on that particular location.
Add unique value to each location specific page, and you're doing great.
Sometimes you do want to get links removed if you suspect they are hurting your site. Richard Baxter wrote a great article today about how to determine the quality of your backlinks.
https://seogadget.co.uk/bad-backlink-checking/
Unless you've suffered a penalty, it's usually easier just to build up more good links than worry about the bad.
As for SEO SpyGlass, to be completely honest, I've tried the free version, and although I can't say anything negative about the service, neither did I feel compelled to upgrade to a paid version.
That said, I appreciate Olga's comments. I'm not here to promote SEOmoz or provide product comparisons with a competitor. As an SEOmoz employee, I'm certainly biased (we both are), and truth be told, I've heard others say good things about SEO SpyGlass (but of course, I've heard a lot of people say great things about SEOmoz).
You should try out both the SEO SpyGlass free version and sign up for the 30-day free trial of SEOmoz, and see which you like best.
Hope this helps. Best of luck with your SEO.
My opinion is that the URLs are less important than the actual content on the page, including title tags, headlines, body text, etc.
Unfortunately, there's no precise way to determine when you've crossed the line into "near-duplicate" content, so it's best to make each page serve a unique experience targeted toward a specific purpose.
Hi Zach,
Good question. In the old days, this was referred to as "PageRank sculpting" - a process where you would place the nofollow tag on your unimportant links in order to pass more value through the links you wanted to count.
Then, in 2009, Google supposedly "plugged" the nofollow hole. Rand and team did a classic Whiteboard Friday on the topic. I still remember watching it the day it came out:
http://www.seomoz.org/blog/whiteboard-friday-how-do-we-plug-the-nofollow-leak
Today, you don't hear much about PageRank sculpting. Most SEOs don't bother with it, partly because of its decreased effectiveness, but also in part because there are more effective ways of controlling the influence of links.
This is where the issue becomes more complex. Link "equity," or PageRank (or mozRank), is only one small factor in the overall value of a link. Anchor text, position on the page, and a host of other factors all affect how much influence any given link can wield. Here's a good introduction on the subject (again from Rand).
Hope this helps! Best of luck with your SEO.
Hi Paul,
Linkscape is updated every few weeks, and with each new index old links are cleaned out. Occasionally you can find links that no longer exist, but Linkscape will look for these links during its next crawl of the web, and only include them if found. So any non-existent links tend to disappear within a few weeks.
This is actually a very interesting problem for web crawlers, because 50% of the entire web disappears every year, and 80% disappears every two years! Keeping up with this churn is a monumental task. In fact, over half the time it takes to publish a new Linkscape index is spent on processing to ensure its accuracy.
As Ari mentioned you can try Majestic SEO. You can also find some (highly inconsistent) backlink information in Webmaster Tools. No two link reports from different sources will ever be the same so you will definitely see different results. That said, if I had to choose one source for backlink information I prefer the numerous advantages offered by OSE.
Occasionally OSE reports links that are "hidden" or otherwise hard to find. In extreme cases it's best to check the source code of the website to ensure the links are actually missing.
Also, feel free to report any possible OSE errors to the SEOmoz help team (help@seomoz.org) Any information can help make the tool better.
Best of Luck with your SEO!
Hi Eada,
There are 2 things you want to do here to "help" Google re-crawl your site.
1. Submit a new sitemap via Google Webmaster Tools. Make sure the sitemap is up to date. If your sitemap hasn't changed, resubmit it.
2. Do a "Fetch as Googlebot" - also in Webmaster Tools. Fetch your homepage, then choose the option that says "Submit to index and all linked pages."
An engineer at Google suggested #2 to me after my own site was hacked and I cleaned it up. It's a signal to Google to take another look at those pages.
Hi Mike,
First off, if you have a couple hours to spare, I'd start by watching a couple of videos. They are over a year old, but still contain tons of good knowledge about using Moz tools:
http://www.seomoz.org/dp/pro-webinar-november-2010-with-danny-dover
http://www.seomoz.org/dp/webinar-february-seomoz-pro-overview
Yes, fixing errors is often considered "low-hanging-fruit" and can sometimes have a quick effect on improving your rankings - but not always, unfortunately.
For beginners, the On-page optimization tool is one of my favorites, because it contains so much knowledge in a single tool. Pick a keyword and URL, and get to work.
Hope this helps! Best of luck with your SEO.
Turns out I wrote a post that expanded on this idea of keyword themes: https://moz.com/blog/keywords-to-concepts
Hope that helps! Best of luck with your SEO.
Hi Ken,
I'd actually worry less about the number of links on the page and try to address the duplicate content/title tag issues.
In reality, it's very common and sometimes necessary for websites, especially eCommerce websites, to have more than 100 links per page. See this post from Dr. Pete.
Hi Matthew,
You definitely want your pages to resolve to one version or another, either http or https. Don't leave it for the search engines to sort it out.
For instance, take a look at Paypal, which redirects every single URL to https.
Google and all major search crawlers can now handle https with ease, but if you place your content on 2 different URLs, this can count as duplicate content.
If the https pages actually redirect (via 301) to http, there is no issue of cloaking.
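For what it's worth, here's a minimal .htaccess sketch of that kind of redirect (assumes Apache with mod_rewrite; flip the condition and target if https is your preferred version, as in the PayPal example):

```apache
RewriteEngine On
# 301 every https URL to its http twin
RewriteCond %{HTTPS} on
RewriteRule ^(.*)$ http://%{HTTP_HOST}/$1 [R=301,L]
```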
Does that help?
Hi Jimmy,
Thanks for the really fun question (note: negative SEO isn't fun, but trying to figure out the algorithm is).
Couple of reasons why I think this would be difficult:
1. We have very limited working knowledge of both co-citation and co-occurrence. What we do know at this point is little more than theory. So working them into an actionable strategy for positive rankings would be hard enough, I imagine, let alone negative SEO.
2. The signals produced by these measurements are likely to be weaker than traditional link signals, thus reducing the incentive to use them.
3. One of the reasons we believe search engines may use co-citation and co-occurrence is that they are harder to game (especially when combined with authority and trust metrics), so it follows that they would also be harder to game in the negative.
That said, it's so new I barely know what I'm talking about. Really interesting area, though.
In my view, it's not a big problem to have these sites on the same server, as long as each site has unique content and there isn't a huge amount of inter-linking between the sites (in which case, 30 sites would sound a lot like a link network to me, which would risk all of your sites being de-indexed).
What is potentially a problem is making sure each site is truly unique. This means having unique content (including navigation and, in some cases, unique templates - although some would argue this is a bit extreme). You say that the sites employ the same link building methods - does this mean they all have links from the exact same sources? If so, you could be stumbling into link network territory... or Google may choose to place less value on those links.
It's a murky area, and you'll have to use your own analytics and judgment to tease the answers out of Google. I'd say you're not in trouble just because of the shared IP, but my suspicion tells me you're in a dangerous neighborhood.
Hope this helps. Best of luck with your SEO!
These are the ones I recommend:
This question is full of mystery. If you "nofollow" an outbound link, Google won't count it, and this is the safe thing to do for untrusted links 99% of the time.
But why would you not want Google to "see" the link? There is a technique called "masking" or "cloaking" in which you redirect the link through a blocked directory (via robots.txt), then use PHP or another method to send the link on its merry way.
This is often done with affiliate links. Do a Google search for how to mask (or cloak) affiliate links and you should find several techniques to do what you're looking for.
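As a rough sketch of how that's typically wired up (everything here is hypothetical - the /go/ directory, the ?to= parameter, and the affiliate URL are made up for illustration): block the directory in robots.txt with "Disallow: /go/", then have a small script forward visitors:

```php
<?php
// Hypothetical /go/index.php - the /go/ directory is blocked in robots.txt,
// so search engines shouldn't crawl these redirects.
$destinations = array(
    'poker-room' => 'http://affiliate.example.com/?ref=12345',
);
$key = isset($_GET['to']) ? $_GET['to'] : '';

if (isset($destinations[$key])) {
    header('Location: ' . $destinations[$key], true, 302);
} else {
    header('Location: /', true, 302); // unknown key: fall back to homepage
}
exit;
```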