Posts made by Dr-Pete
-
RE: Lost 50% google traffic in one day - panic?
It's hard to tell without seeing your copy, but it sounds like you're talking about internal links within your own site. Google is talking about outbound links. Internal anchor text (linking within your site) is rarely penalized.
-
RE: Lost 50% google traffic in one day - panic?
Just want to let people know that I believe this is just an overall quality control form. It's worth a try, but I don't expect they'll take action on individual sites (I could be wrong, but don't get your hopes up, in other words).
-
RE: Can you 404 any forms of URL?
I would either go with parameter blocking or META NOINDEX this page (that's probably a bit more effective). It would be better to block it from Google than to 404 100s of variants, as you could see a spike in 404s and that can cause some problems.
Sorry, edited this - you don't want to 404 the login pages, because that's going to return a 404 for visitors as well, and the pages won't function properly. You want to just keep this away from the bots.
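For what it's worth, a minimal sketch of that META robots tag (assuming it goes in the <head> of every login-page variant - adjust for your own templates):
<meta name="robots" content="noindex" />
That keeps the bots away without breaking the page for visitors the way a 404 would.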
-
RE: How to prevent duplicate content within this complex website?
Yeah, time-sensitive information is always tough. I think you're dead on about the disadvantages - the timing of Google's application of these rotating tags would always be off, and you could end up with some really weird search results that are not only bad for SEO but could create bad UX (people landing on old pages thinking they're new).
What about another option - could you take more of a news/blog approach and have a "/current" page that is always the current week? As the current week changes, roll that content into an archive page ("/week8", etc.). That way, the content lives on, but the current URL never changes.
In terms of duplication, is this really full duplication? It sounds like some pages (like the season) just have snippets of the current week. That's not necessarily a problem. If they are very similar, could you "widgetize" it somehow? Could be straight HTML, but use a condensed format for the season page that links to the full version on the current week page. This would be much like a snippet of a blog post - instead of repeating everything on all 3 pages, have one main chunk of content and two summaries.
-
RE: Duplicate page errors
Odds are that this isn't causing you major problems yet, but if the variants are indexed by Google (you can check with the "site:" operator), I would use the META robots tag. Robots.txt is good for prevention, but it's not great at clearing pages out of the index, in my experience.
-
RE: Should I Wait Until the "Dust Settles" on the Algorithm Update or Get Busy Now?
I'd echo Rand but will just add this. You've got two factors here:
(1) On the one hand, the "Penguin" update hit hard, and I don't expect it to go away. Google may make minor adjustments, but those could happen over weeks and months, and the philosophy is here to stay. Keep in mind that we're over a year into Panda updates now. So, absolutely don't wait for the dust to settle.
(2) On the other hand, you're dealing with an existing penalty, and you do need to resolve that. Don't panic, in other words. Act decisively, but follow through on solving the current problem.
If you go changing things site-wide, like titles/URLs, it's going to be hard to get a clean read of the data. If you can, tackle the worst culprits proactively (a spammy home-page title, for example). Tackle anything that's obviously a win - dupe content is a great example - you know it isn't helping you. From a link-building perspective, diversify and show a positive forward path.
-
RE: Remove Unatural Links
While I do think you can trade off positive link-building vs. cutting negative links (depending on the severity), I have to disagree with "the warning will still be there no matter what you do". We're seeing accounts from reputable firms that this warning is real, can be followed by a penalty, and will be lifted if action is taken.
The trick is that the warning covers many levels of severity, and there's no good way to tell how severe this one is. If you know you have very questionable links, I'd kill the worst of them, personally. This would include paid links or very obvious, low-value spam.
I'd also tell your SEO company to shift gears ASAP. As Ivaylo said, diversify - whether you cut links or not.
-
RE: Lost 50% google traffic in one day - panic?
The so-called "over-optimization" penalty definitely hit hard, sometime between 4/24-4/25. It is possible to see Google make a correction, but it doesn't appear that there were major changes between 4/25-4/26. Of course, this just happened, so we don't have much data.
Given the large scale of the drop and that this is a confirmed algorithm change, I think you have to start there. This is the official post from Google and I see that someone has already linked to Rand's post:
http://googlewebmastercentral.blogspot.com/2012/04/another-step-to-reward-high-quality.html
Look specifically at the terms that dropped. Are you aggressively targeting those terms (with keyword stuffing, inbound anchor text, etc.)? I think the "Don't Panic" advice has some merit - if you overreact trying to reverse over-optimization, you've just made even more of a mess. I'm not saying to sit on your hands, but dig deep into the analysis and put together a reasonable plan of attack - don't just start changing things at random.
-
RE: Avoid Keyword Self-Cannibalization
Are you saying you created separate campaigns for both the "www" and non-www versions of the site here on SEOmoz (or somewhere else, like Google Webmaster Tools)? I guess the first question would be if these look like duplicates - not sure why you're reporting on them separately.
-
RE: Too many links - How to address without removing them?
The short answer is "no". There's no safe way (i.e. that doesn't look like cloaking) to NOINDEX just a few links, and nofollow doesn't work for internal PR-sculpting anymore.
When you're talking about something like an archive - a long list of resources - I wouldn't get too hung up on it. It's really an issue of balance. If every page on your site has 200 navigation links, you're spreading yourself really thin. If one page of your site references 100 blog posts in a list, that's pretty natural. You could paginate that list and use rel=prev/next or something like that, but there's always a trade-off (the first page would probably pass more PR and the later pages less).
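If you did paginate, a rough sketch of the prev/next tags (assuming a hypothetical /archive/ URL structure - these would go in the <head> of page 2):
<link rel="prev" href="http://www.example.com/archive/" />
<link rel="next" href="http://www.example.com/archive/page3/" />
The first page only carries rel="next" and the last page only carries rel="prev".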
The other option would be to have two archives - A Top 25/50/etc. that has the posts you most want to pass internal PR to, and then link that to an archive with everything. That would give the main posts more prominence.
Always a bit hard to say without seeing the site/page, but in this case, I don't think I'd lose sleep over it.
-
RE: How do fix twin home pages
It could just be a time-lag in our data (and that wouldn't shock me), but run a header checker and make sure the 301 is working properly. For example, try a command-line check with curl -I against the duplicate version of the home page and confirm you get a 301 status with a Location header pointing at the version you want to keep.
-
RE: Concerned
Argh - I'm sorry, yes. The hreflang="" code is the same, but the URL is the cross-language version of that URL. As long as the URL structure stays the same, this shouldn't be too hard, but if you use different structures, it could be a pain. I'm editing my previous reply.
-
RE: Concerned
No - I'll be perfectly honest: I don't do a ton of international. The international SEOs I trust seem to think positively about the new tags, but we don't have a ton of data. The upside is that they're relatively easy to implement and they don't carry any real risk. The worst that happens is that it doesn't work.
My gut reaction is that there's regional confusion and Google is having a tough time reconciling duplicates. That's more in line with the inconsistent ranking you describe than a full-blown penalty would be.
-
RE: Concerned
The two sites should point at each other and use the region codes, so...
(1) The English (.co.uk) site needs an hreflang tag pointing at the Irish version of the page.
(2) The Irish (.ie) site needs an hreflang tag pointing back at the English version.
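As a rough sketch (assuming example.co.uk and example.ie as stand-ins for your real home-page URLs):
On the English site: <link rel="alternate" hreflang="en-ie" href="http://www.example.ie/" />
On the Irish site: <link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/" />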
That way, whichever site Google hits, they're aware of the other site(s).
-
RE: Duplicate Page Content
Oh, sorry - didn't catch that some were duplicated. Given the scope, I think I'd put the time into creating unique titles and single-paragraph descriptions. There's a fair shot these pages could rank for longer-tail terms, and the content certainly has value to visitors.
-
RE: Concerned
This can get tricky - rel-canonical passes link juice, but it could also prevent the .ie pages from ranking. Google is a bit inconsistent with this internationally - sometimes a non-canonical version will still rank if it's more relevant to the country/language of the query - but I'd hate to trust that.
-
RE: Concerned
Unfortunately, while you should theoretically be able to target .co.uk and .ie separately, Google can screw it up on occasion and treat them as duplicates. If searching for the copy brings up the .ie site on Google.co.uk, that's definitely a possibility. You could try the new hreflang approach - see this Google resource:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=189077
It's basically for regional content where the language is the same (there are other variants, but that's a big one), since Google knows they don't always get it right.
It is also possible that the .co.uk page has been penalized and other content is just being brought in to fill the spot - since the PDF is at #68, that's also possible. Have you done any recent link-building pushes to this particular page?
-
RE: Duplicate Page Content
I'm wondering if we're looking at two different things - I was looking at the pages like:
http://www.jaaronwoodcountertops.com/wood-countertop-gallery/walnut-countertop-8.html
http://www.jaaronwoodcountertops.com/wood-countertop-gallery/walnut-countertop-9.html
These already seem to have unique titles.
-
RE: Can Location Information Decrease National Search Volume ?
Miriam has a much better head for local than I do, but my sense of recent local updates (like "Venice") is the same as hers. If Google starts to treat a SERP as local, then being seen as local should help you and being seen as non-local may hurt you. On average, I think we're seeing more SERPs being treated as having local intent, so generally getting Google to locate you should be a positive. Of course, for any given keyword, there could be exceptions (especially if you get a huge amount of traffic from a few broad, "head" terms).
Hosting issues can always cause short-term problems, and there have been a lot of shake-ups in the algorithm recently. Google is probably testing over-optimization updates (not full roll-outs, but we're seeing weird things happening), and they had a glitch on 4/17 that shook up some sites. They've also been hitting link networks hard, but I find it hard to believe that a handful of reputable local directories would be impacted by that. If you built up 100s of low-quality directory links and doubled your link profile in the process, that could be trouble. Adding a couple of directories to a solid profile should pose no risk.
My gut reaction is that there's more going on here than just local factors.
-
RE: Duplicate Page Content
This isn't a typical application of rel=prev/next, and I'm finding Google's treatment of those tags is inconsistent, but the logic of what you're doing makes sense, and the tags seem to be properly implemented. Google is showing all of the pages indexed, but rel=prev/next doesn't generally de-index paginated content (like a canonical tag can).
Where is GWT showing them as duplicates (i.e. title, META description, etc.)?
Long-term, there are two viable solutions:
(1) Only index the main gallery (NOINDEX the rest). This will focus your ranking power, but you'll lose long-tail content.
(2) Put in the time to write at least a paragraph for each gallery page. It'll take some time, but it's doable.
Given the scope (you're talking dozens of pages, not 1000s), I'd lean toward (2). These pages are somewhat unique and do potentially have value, but you need to translate more of that uniqueness into copy Google can index.
-
RE: Guest posts/article marketing can be considered as paid posts by SEs?
Guest posts/articles wouldn't generally be considered paid, but they can create problems if they're obviously low quality or if you're spinning the same articles across dozens of sites. It really depends on a lot of factors:
(1) How much you use this tactic. No single tactic like this, especially if low-quality, should be the bulk of your link-building. Diversity is very important ("natural" link profiles tend to be diverse).
(2) If the sites are part of a link network. There's been a big crackdown lately on networks, and many article marketing services use them. If you're buying into a network or service, it's a lot more likely to be a problem. If you're finding places to guest post manually, it's probably not a big risk.
(3) If the post/articles are clearly spammy. Use your judgment - if you look at the blogs your articles are posted on, and there are 20 other articles all on unrelated topics in spammy verticals (mortgages or pay-day loans, for example), it's going to be easy for Google to spot your quality issues. You may not get penalized, but the links will be devalued.
-
RE: How do fix twin home pages
Yeah, it sounds like you're not currently having major issues. I think it's good to prevent these issues (and duplicates are a real concern), but you can ease into this one, I strongly suspect.
-
RE: How do you incorporate a Wordpress blog onto an ecommerce website?
If it's done correctly and Google sees it as if it lives in the subfolder, then yes - it's perfectly fine for SEO. This is a technically tricky solution, though, and would really depend on the capabilities of your hosting provider.
-
RE: How do fix twin home pages
Since this issue can occur site-wide, I do tend to agree with Anton that 301-redirects are a better solution for this particular problem (although canonical tags will work, if that's your only feasible option). It is important, as implied in the comments, to make sure that your internal links are consistent and you aren't using both versions within your site (although, with "www" vs. non-www, that's pretty rare).
Practically, it depends a lot on the size of your site, whether you have links to both versions, and whether Google has indexed both versions. This is a problem in theory, but it may not currently be a problem on your site. You can check the indexed pages of the root domain and the www subdomain separately in Google with these commands:
site:mysite.com inurl:www
site:mysite.com -inurl:www
(the first pulls up anything with "www", and the second only pages without it).
If you're seeing both in play, then sorting out how to do the 301-redirects is a good bet. If you're not, then it's still a solid preventive measure, but you don't need to panic.
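If canonical tags do end up being your only feasible option, a minimal sketch (assuming "www" is the version you want to keep, and example.com as a stand-in for your domain) is to add a tag like this to the <head> of each page, with each page referencing its own preferred "www" URL:
<link rel="canonical" href="http://www.example.com/your-page/" />
The 301s are still the cleaner fix, though.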
-
RE: How do you incorporate a Wordpress blog onto an ecommerce website?
I can't vouch for these tactics, but there are ways to port WordPress to .Net. For example:
http://www.php-compiler.net/blog/2011/wordpress-on-net-4-0
http://sourceforge.net/projects/wordpressnet/
It might be better to go with a .Net-native app, but it's not completely impossible to run WordPress.
Can they set up a reverse proxy? You could theoretically run the current WordPress blog on a separate server, but then make it look like it "lives" on a subdomain or subfolder. It's a bit tricky, but it's possible.