Google considers this to be spam. Some pages get away with it for a while, but generally you're going to end up with a manual action reported in Search Console eventually.
Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
MichaelC-15022
@MichaelC-15022
Job Title: SEO Consultant
Company: OzTech, Inc.
Founder of Visual Itineraries, a closing tool/lead generation tool for niche and high-end travel agents. Co-founder of TheBigDay honeymoon registry; avid traveler, photographer, and still plays with cars and motorcycles. As of 2010, plays with airplanes too :-). Living the life in Bend, Oregon.
Favorite Thing about SEO
Truly interesting and lively people in this industry.
Latest posts made by MichaelC-15022
-
RE: Google Rich Snippets in E-commerce Category Pages posted in Intermediate & Advanced SEO
I generally recommend putting basic Product markup (name, price, maybe image, URL pointing to the single product page) at that level. The idea here is to let Google understand that the page contains a big list of products that fit the category as seen in the page title.
DO NOT put reviews at this level--I saw something from Google recently that says they consider that to be a spammy attempt to get ratings snippets in the results for that page. Put the reviews only at the single product page level.
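As a rough sketch of what that category-level markup might look like (JSON-LD; the product names, prices, and URLs here are invented for illustration), note there's no review/rating markup at this level:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "ItemList",
  "itemListElement": [
    {
      "@type": "ListItem",
      "position": 1,
      "item": {
        "@type": "Product",
        "name": "Purple Widget",
        "url": "https://www.example.com/products/purple-widget",
        "image": "https://www.example.com/images/purple-widget.jpg",
        "offers": { "@type": "Offer", "price": "19.99", "priceCurrency": "USD" }
      }
    },
    {
      "@type": "ListItem",
      "position": 2,
      "item": {
        "@type": "Product",
        "name": "Giant Purple Widget",
        "url": "https://www.example.com/products/giant-purple-widget",
        "offers": { "@type": "Offer", "price": "49.99", "priceCurrency": "USD" }
      }
    }
  ]
}
</script>
```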
-
RE: How long should the Image Alt Text be for SEO? posted in Image & Video Optimization
There isn't really any limit, like there is for page titles, meta descriptions, etc. Typically you'll want the ALT text to explain what's in the image--the original purpose was to show the user what the image was before it downloaded, and also so that screen readers could describe the image to vision-impaired folks by reading the ALT text aloud.
If you're looking for the image to reinforce the relevance of the page for the page's target topic, then make sure that the topic term is in the ALT text, usually as part of a long phrase or sentence. If you're looking for the image to rank well in Google image search, then I'd keep the ALT text to just what the target term is (and of course make sure the page title reflects that term as well).
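For example (the image files and phrasing here are invented), the two styles might look like:

```html
<!-- Reinforcing page relevance: target topic inside a natural, descriptive phrase -->
<img src="/images/bora-bora-bungalow.jpg"
     alt="Overwater bungalow at a Bora Bora honeymoon resort at sunset">

<!-- Targeting Google image search: ALT text kept to just the target term -->
<img src="/images/purple-widget.jpg" alt="purple widget">
```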
-
RE: SEO Impact of High Volume Vertical and Horizontal Internal Linking posted in Intermediate & Advanced SEO
No, keep doing it the way you're doing it. That's perfectly good link juice flowing between those pages.
Breadcrumbs are a nice way to communicate the hierarchy to Google--not because they're breadcrumbs per se, but simply because of their nature: every page contributes link juice back up to each of its ancestor pages. A child page has the fewest internal links; its parent has more; its grandparent even more; etc.
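For instance, a breadcrumb trail like this (the category names are made up) gives every ancestor a followed link from each of its descendant pages:

```html
<nav class="breadcrumbs">
  <a href="/">Home</a> &gt;
  <a href="/widgets/">Widgets</a> &gt;
  <a href="/widgets/purple/">Purple Widgets</a> &gt;
  Giant Purple Widget
</nav>
```

Since every purple-widget product page repeats this trail, the Purple Widgets category page collects a link from all of its children, the Widgets page from all of its grandchildren, and so on up.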
-
RE: Hreflang and paginated page posted in Intermediate & Advanced SEO
I found no examples, sorry...
I don't understand your comment about rel=canonical. There should be ONLY ONE rel=canonical, and it should reference its own page, EXCEPT in the rare case I outlined above where the content on two different country pages is essentially identical.
-
RE: Hreflang and paginated page posted in Intermediate & Advanced SEO
Separate the language markup issue from the pagination issue, and treat each of the paginated pages just like any other page on the site.
You should have an hreflang statement for EVERY language page you support for each page in the pagination sequence, including the current page. So, for example, if we're looking at Italian page 17 of your Purple Widgets category, it should have an hreflang for the Italian page 17, as well as for the English page 17, French page 17, etc.
Rel=next and rel=previous should refer to the page from the same language as the page you're in, i.e. on Italian page 17, rel=prev should point to Italian page 16, and rel=next should point to Italian page 18.
I'm presuming, of course, that the content in the paginated pages is roughly equivalent--i.e. if it's a set of pages of purple widgets, they're sorted the same way on the Italian version as on the French, etc. But really, even if they weren't...I'd still probably do it the same way.
Don't forget to set the rel=canonicals as well. Unless you're looking at two pages with the same language and content but targeting different countries (e.g. Portugal and Brazil, with no pricing info on the pages...in that case, you might rel=canonical both the Portuguese and Brazilian pages to one of those), each page will rel=canonical to itself.
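Putting it all together for the Italian page 17 example (the domain and URL scheme here are invented), the head of that page might contain:

```html
<!-- Italian page 17 of the Purple Widgets category -->
<link rel="canonical" href="https://www.example.com/it/purple-widgets?page=17">

<!-- Pagination links stay within the same language -->
<link rel="prev" href="https://www.example.com/it/purple-widgets?page=16">
<link rel="next" href="https://www.example.com/it/purple-widgets?page=18">

<!-- One hreflang per supported language, including the current page itself -->
<link rel="alternate" hreflang="it" href="https://www.example.com/it/purple-widgets?page=17">
<link rel="alternate" hreflang="en" href="https://www.example.com/en/purple-widgets?page=17">
<link rel="alternate" hreflang="fr" href="https://www.example.com/fr/purple-widgets?page=17">
```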
-
RE: Ever seen this tactic when trying to get rid of bad backlinks? posted in White Hat / Black Hat SEO
I don't think there's any tactic happening. They're simply building lots of mini websites for their clients and messed up by not nofollowing affiliate links. It appears that they haven't done any of the basic SEO audit work on their system. Nothing deliberate here, IMHO.
-
RE: Ever seen this tactic when trying to get rid of bad backlinks? posted in White Hat / Black Hat SEO
So if I understand you correctly, your client who's penalized is www.starmandstravel.com, and you're seeing in their GWT backlinks list a ton of links from content.onlineagency.com/index.aspx?site=6599 and one or both of the other parameters are varying, right?
So then your question is: where are the onlineagency pages linked from?
-
RE: How many product subcategories are ok? posted in On-Page Optimization
2 years back....that puts it right in the middle of a lot of Google algo changes, both Panda and Penguin. It might just be a coincidence that your traffic drop happened around the time of the hacking.
Have you done a backlink analysis? It could be that the hackers also planted a ton of crappy links in the hopes of getting your site to rank well short-term for whatever they were trying to inject into your site and sell.
Sounds like you're doing the right thing with the URLs though. To do a double-check on the duplicate content side, you could run Screaming Frog against it, and check for duplicate page titles and meta descriptions. If your pages are getting spit out with multiple URLs, you'll see them all show up with duplicates.
-
RE: Organic search traffic dropped 40% - what am I missing? posted in Intermediate & Advanced SEO
Did the layout of the header area change significantly? If, for instance, the header area went from 1/10th of the "above the fold" area to 1/3rd, that might run the entire site afoul of Google's "top heavy" page-layout algorithm.
Best posts made by MichaelC-15022
-
RE: How does a collapsed section affect on page SEO? posted in On-Page Optimization
Hi Stephan,
Presuming the expand/collapse thing is done properly, it should be golden. You'll find a lot of sites use this approach when they have multiple pages of content, e.g. a product page with specifications, reviews, technical details, etc.
I do this on my travel website. A great way to test to see if the initially-collapsed content is being seen and indexed by Google is to take a block of text from the collapsed section and search for it in double-quotes.
Here's an example: search for "At the Bora Bora Pearl Beach Resort you can discover the sparkling magic of the lagoon". You'll find my site there at #3 (Visual Itineraries), along with the other 1000 websites who've also copied the resort's description straight from the resort's website (yeah, I really shouldn't do this). So much for Google's duplicate content detection when it comes to text chunks...BUT I DIGRESS. That content you see is on the More Info tab.
Now, on to what "done properly" means:
- each tab should be in a separate div
- assign every div a class whose CSS rule is display:none, EXCEPT the currently selected tab's div
- have onclick handlers for the tabs that first set all of the divs to the display:none class, then set the newly selected tab's div to a class with display:block or display:inline
And not done properly would mean something like swapping a div's text in via a Javascript onclick()....because Google won't see text that exists only in the Javascript. It's got to be in the HTML.
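Putting the steps above together, a minimal sketch of the "done properly" approach might look like this (tab names and IDs are just placeholders):

```html
<style>
  .tab-hidden { display: none; }
  .tab-shown  { display: block; }
</style>

<a href="#" onclick="showTab('tab-overview'); return false;">Overview</a>
<a href="#" onclick="showTab('tab-moreinfo'); return false;">More Info</a>

<!-- ALL tab content is present in the HTML from the start, so Google indexes it -->
<div id="tab-overview" class="tab-shown">Overview content here...</div>
<div id="tab-moreinfo" class="tab-hidden">More Info content here...</div>

<script>
  function showTab(id) {
    var tabs = ["tab-overview", "tab-moreinfo"];
    for (var i = 0; i < tabs.length; i++) {
      // hide every tab's div, then show only the one that was clicked
      document.getElementById(tabs[i]).className =
          (tabs[i] === id) ? "tab-shown" : "tab-hidden";
    }
  }
</script>
```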
That's about it. Not so tricky, really. And works well both for usability (no roundtrip to the server, not even an Ajax fetch!) and for SEO (lotsa yummy content on a single page for Panda).
-
RE: Organic search traffic dropped 40% - what am I missing? posted in Intermediate & Advanced SEO
Possibilities:
- The layout of the product pages for the new shopping cart is pissing off Panda. If that's the case, the traffic to the home page shouldn't have changed much, but the product pages will have dropped.
- Panda now sees the pages in general as having less content than before, perhaps images aren't getting loaded in the pages in such a way that Google sees them whereas they were before, something like that....and Panda now thinks the entire site is less rich in content.
- It often seems to take Google a month or so to "settle out" all of the link juice flows when you do a bunch of redirects, have new URLs, etc. I would expect that the link juice calculation is iterative, and that would be why it would take a number of iterations of the PageRank calculation in order for entirely new URLs to "get" all the link juice they should have.
- Their backlinks were moderately dependent upon a set of link networks, and those link networks have shut down all their sites (so that neither Google nor Bing still see the links from them).
Those are the ideas that come to mind so far.
-
RE: SEO Impact of High Volume Vertical and Horizontal Internal Linking posted in Intermediate & Advanced SEO
No, keep doing it the way you're doing it. That's perfectly good link juice flowing between those pages.
Breadcrumbs are a nice way to communicate the hierarchy to Google--not because they're breadcrumbs per se, but simply because of their nature: every page contributes link juice back up to each of its ancestor pages. A child page has the fewest internal links; its parent has more; its grandparent even more; etc.
-
RE: Ever seen this tactic when trying to get rid of bad backlinks? posted in White Hat / Black Hat SEO
I don't think there's any tactic happening. They're simply building lots of mini websites for their clients and messed up by not nofollowing affiliate links. It appears that they haven't done any of the basic SEO audit work on their system. Nothing deliberate here, IMHO.
-
RE: Should my canonical tags point to the category page or the filter result page? posted in Algorithm Updates
Hi Daniel,
You're going to have to walk a fine line between having a page for every possible combination of filtered results that a user might search for AND appearing to have a ton of pages that are really almost identical....and suffering the wrath of Panda upon seeing what it thinks is duplicate content.
The easy way out is to have 1 page for each category, and no matter what filters are applied, rel=canonical to that category. Dupe content problem solved.
So why isn't this the ideal solution?
#1 You may be missing out on targeting combinations of categories and filters that users will commonly search for. Let's say you were selling clothing, and a category was shirts, and you had a filter for men/women/boys/girls. By making all shirts list pages rel=canonical to the overall shirts list page (with no filters), you'd be missing an opportunity to target "boys shirts".
#2 You may be missing opportunities to pour more link juice to the individual product pages. It's unclear (to me, anyway) whether Google adds the link juice from all pages rel=canonical'ed to a page, or whether Google simply treats rel=canonical as "oh ya, I've already seen & dealt with this page". Certainly in my testing I've seen places where pages rel=canonical'ed to another page actually still show up in the search results, so I'd say rel=canonical isn't as solid as a 301.
So what do you do? I'd recommend a mix. Figure out what combinations you think you can get search traffic from, and find a way to break down the complete set of combinations of filters and categories to target those, and to rel=canonical every page to one of your targeted pages.
It's entirely possible (likely, even) that you'll end up with a mix. For instance, going back to my earlier example, let's say you had another filter that was, let's say, price range. You might want to target "boys shirts", but not "boys shirts under $20". So, while "boys" was a filter value, and "under $20" was a filter value, you might rel=canonical all pages in the category "boys" with a filter value of "shirts" to your page that has just that category and that 1 filter set, regardless of setting of the price filter.
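In markup terms, for the hypothetical boys-shirts example (URL structure invented), the filtered page's head might carry:

```html
<!-- On /shirts/?gender=boys&price=under-20, point the canonical at the
     category+filter combination you've chosen to target, ignoring the
     price filter you decided not to target: -->
<link rel="canonical" href="https://www.example.com/shirts/?gender=boys">
```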
Clear as monkey poop?
-
RE: Should my canonical tags point to the category page or the filter result page? posted in Algorithm Updates
I agree, that's a great approach. I think you mean Javascript, not Java though (that's a different language). The only thing that might make this approach a challenge would be if you had so much product data before filtering that it caused a performance problem, i.e. let's say you had 50 pages of results...if you filter server-side, you're only sending down 1 page of results, whereas if you're filtering with client-side Javascript, you've got to send all 50 pages down and then filter it in the browser.
-
RE: How many product subcategories are ok? posted in On-Page Optimization
2 years back....that puts it right in the middle of a lot of Google algo changes, both Panda and Penguin. It might just be a coincidence that your traffic drop happened around the time of the hacking.
Have you done a backlink analysis? It could be that the hackers also planted a ton of crappy links in the hopes of getting your site to rank well short-term for whatever they were trying to inject into your site and sell.
Sounds like you're doing the right thing with the URLs though. To do a double-check on the duplicate content side, you could run Screaming Frog against it, and check for duplicate page titles and meta descriptions. If your pages are getting spit out with multiple URLs, you'll see them all show up with duplicates.
-
RE: How to optimize SEO value of links in a calendar posted in Web Design
Hi Chris,
With any calendar plugin or widget you're looking at, you can use the Moz toolbar's highlighter to instantly show you whether the links in the calendar, or event descriptions, are followed. That's the first thing to check out (I don't know if Google Calendar links are or not).
Something else to look at with calendaring is that apparently it's pretty easy to get rich snippets and nice placement in the SERPs if you mark up the events with schema.org/Event markup. You'd probably want to put that markup on the individual event pages, of course, and NOT in the calendar itself.
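A bare-bones sketch of that Event markup on an individual event page (the event itself is invented for illustration):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Event",
  "name": "Bend Summer Festival",
  "startDate": "2014-08-09",
  "location": {
    "@type": "Place",
    "name": "Downtown Bend",
    "address": "Bend, OR"
  },
  "url": "https://www.example.com/events/bend-summer-festival"
}
</script>
```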
Third consideration: you'll want to be very careful about how many clicks it is from the home page to the individual event pages, and to try to minimize that. For instance, let's say you care about ranking for the name of a big event to be held in August 2014 now....and, to get to that event from your home page, you had to click:
- Events (then it shows March 2014 calendar)
- next month
- next month
- next month
- next month
- next month
- [name of event]
That'll flow piddly amounts of link juice to your August event's page, and your rankings will really suck.
You might address that by adding links to the next 12 or 24 months calendar pages from the main Events page...or, even better, if Events is a main nav pulldown, have each of the next 12 or 24 months in the pulldown, so that the August 2014 calendar page gets a link from every single page on your site.
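In HTML terms, that pulldown might look something like this (the URL scheme is made up):

```html
<li class="nav-events">
  <a href="/events/">Events</a>
  <ul class="pulldown">
    <li><a href="/events/2014-03/">March 2014</a></li>
    <li><a href="/events/2014-04/">April 2014</a></li>
    <!-- ...one entry per month, out 12 or 24 months... -->
    <li><a href="/events/2014-08/">August 2014</a></li>
  </ul>
</li>
```

Because this nav is on every page of the site, each monthly calendar page picks up a site-wide internal link instead of sitting 6 clicks deep.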
Internal link juice flow can make a massive difference to rankings, indexation, etc., so pay attention to this stuff.

Cheers,
Michael
-
RE: Best & easiest way to 301 redirect on IIS posted in Technical SEO
You have two options:
- Set it up in IIS Manager (best option, least overhead for the server, need no coding skills)
- Code it in classic ASP in a global include file that all pages reference before sending content back to the browser.
Here's a great article that walks you through the IIS config option. For this, you need access to IIS Manager:
Sounds like that option is unavailable to you however.
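For anyone else reading who can at least upload a web.config to the site root (this doesn't need IIS Manager access, but it does need IIS 7+ with the HTTP Redirection feature installed), a rough sketch of that config-based redirect (paths are placeholders):

```xml
<configuration>
  <location path="oldpage.aspx">
    <system.webServer>
      <httpRedirect enabled="true"
                    destination="/newpage.aspx"
                    httpResponseStatus="Permanent"
                    exactDestination="true" />
    </system.webServer>
  </location>
</configuration>
```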
For the other option: your site probably has a file or two that's included at the start of all web pages. (If not, you can add it). In that file, you'll want to check the URL passed in like this:
' SCRIPT_NAME is the path of the page being requested
Dim sThisPage, sThisServer
sThisPage = Request.ServerVariables("SCRIPT_NAME")
sThisServer = Request.ServerVariables("SERVER_NAME")
If (LCase(sThisPage) = "/oldpage.aspx") Then
    Response.Status = "301 Moved Permanently"
    Response.AddHeader "Location", "http://" & sThisServer & "/newpage.aspx"
    Response.End
End If
-
RE: 1000 of links on my website ? is it good or bad posted in Link Building
You've definitely got a lot of links in there in your navigation, but I don't think it's really going to hurt you. It's going to distribute link juice pretty evenly across all of the pages linked to from that left-hand menu and the submenus.
You MIGHT consider reducing the left menu to just the main categories, and NOT including all of the submenus. That would funnel a fair bit more link juice to the main categories, and the submenus I'm guessing are all pretty long-tail anyway, so they shouldn't need as much.
I don't see a duplicate content problem here. I'm seeing different content on every page. You might consider putting noindex/follow on some of the intermediate pages that merely list subcategories, like this one:
http://dorchdanola-netbutik.dk/category/belysning-el-artikler-485/
These kinds of pages will be seen as extremely content-light...and that's not so good.
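The noindex/follow suggestion is just a meta robots tag in the head of those thin listing pages:

```html
<!-- Keeps the page out of the index while still letting Google
     follow (and flow link juice through) its links -->
<meta name="robots" content="noindex, follow">
```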
I wouldn't nofollow your social media links. And rel=me mostly you use to link an author page back to your G+ profile, so I wouldn't use that here.