Posts made by MichaelC-15022
-
RE: Ever seen this tactic when trying to get rid of bad backlinks?
I don't think there's any tactic happening here. They're simply building lots of mini websites for their clients and messed up by not nofollowing the affiliate links. It appears they haven't done any of the basic SEO audit work on their system. Nothing deliberate here, IMHO.
-
RE: Ever seen this tactic when trying to get rid of bad backlinks?
So if I understand you correctly, your client who's penalized is www.starmandstravel.com, and you're seeing in their GWT backlinks list a ton of links from content.onlineagency.com/index.aspx?site=6599 and one or both of the other parameters are varying, right?
So then your question is: where are the onlineagency pages linked from?
-
RE: How many product subcategories are ok?
2 years back....that's going to be right in the middle of a lot of Google algo changes, both Panda and Penguin. It might be just a coincidence that your traffic drop was around when the hacking happened.
Have you done a backlink analysis? It could be that the hackers also planted a ton of crappy links, hoping to get your site ranking well short-term for whatever they're trying to inject into your site and sell.
Sounds like you're doing the right thing with the URLs though. To do a double-check on the duplicate content side, you could run Screaming Frog against it, and check for duplicate page titles and meta descriptions. If your pages are getting spit out with multiple URLs, you'll see them all show up with duplicates.
-
RE: Organic search traffic dropped 40% - what am I missing?
Did the layout of the header area change significantly? If, for instance, the header area went from 1/10th of the "above the fold" area to 1/3rd, that might run the entire site afoul of the "topheavy" part of Panda.
-
RE: How does a collapsed section affect on page SEO?
Hi Stephan,
Presuming the expand/collapse thing is done properly, it should be golden. You'll find a lot of sites use this approach when they have several pages' worth of content on one page, e.g. a product page with specifications, reviews, technical details, etc.
I do this on my travel website. A great way to test to see if the initially-collapsed content is being seen and indexed by Google is to take a block of text from the collapsed section and search for it in double-quotes.
Here's an example: search for "At the Bora Bora Pearl Beach Resort you can discover the sparkling magic of the lagoon". You'll find my site there at #3 (Visual Itineraries), along with the other 1000 websites who've also copied the resort's description straight from the resort's website (yeah, I really shouldn't do this). So much for Google's duplicate content detection when it comes to text chunks...BUT I DIGRESS. That content you see is on the More Info tab.
Now, on to what "done properly" means:
- each tab's content should be in its own div
- give every div a CSS class that sets display:none, EXCEPT the currently selected tab's
- have onclick handlers on the tabs that switch all of the divs to the display:none class, and then switch the newly selected tab's div to a class with display:block or display:inline
And not done properly would mean something like setting the text of a div from Javascript in the onclick() handler....because Google won't see text that only exists in the Javascript. It's got to be in the HTML.
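Here's a minimal sketch of that pattern (the tab names, IDs and class names are all just placeholders):

<a href="#" onclick="showTab('tab-overview'); return false;">Overview</a>
<a href="#" onclick="showTab('tab-reviews'); return false;">Reviews</a>

<div id="tab-overview" class="tab-shown">Overview content here...</div>
<div id="tab-reviews" class="tab-hidden">Reviews content here...</div>

<style>
  .tab-shown { display: block; }
  .tab-hidden { display: none; }
</style>

<script>
  // Hide every tab div, then show the one that was clicked. All of the
  // text is sitting in the HTML itself, so Google indexes it either way.
  function showTab(id) {
    var tabs = document.querySelectorAll('div[id^="tab-"]');
    for (var i = 0; i < tabs.length; i++) {
      tabs[i].className = 'tab-hidden';
    }
    document.getElementById(id).className = 'tab-shown';
  }
</script>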
That's about it. Not so tricky, really. And works well both for usability (no roundtrip to the server, not even an Ajax fetch!) and for SEO (lotsa yummy content on a single page for Panda).
-
RE: Organic search traffic dropped 40% - what am I missing?
Possibilities:
- The layout of the product pages for the new shopping cart is pissing off Panda. If that's the case, the traffic to the home page shouldn't have changed much, but the product pages will have dropped.
- Panda now sees the pages in general as having less content than before. Perhaps images are no longer being loaded in a way that Google can see, whereas they were before....something like that, and Panda now thinks the entire site is less rich in content.
- It often seems to take Google a month or so to "settle out" all of the link juice flows when you do a bunch of redirects, have new URLs, etc. I would expect that the link juice calculation is iterative, which would be why it takes a number of iterations of the PageRank calculation for entirely new URLs to "get" all the link juice they should have (see the toy sketch after this list).
- Their backlinks were moderately dependent upon a set of link networks, and those link networks have shut down all their sites (so that neither Google nor Bing still see the links from them).
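On that iterative point, here's a toy power-iteration sketch in Javascript (NOT Google's real algorithm, and the four-page link graph is made up) showing how a brand-new URL only picks up its rank over a number of passes:

// Three established pages all link to a brand-new URL, which starts at zero.
var links = { a: ['b', 'new'], b: ['new'], c: ['new'], 'new': ['a'] };
var pages = Object.keys(links);
var d = 0.85;                                   // damping factor
var rank = { a: 1 / 3, b: 1 / 3, c: 1 / 3, 'new': 0 };

for (var iter = 0; iter < 20; iter++) {
  var next = {};
  pages.forEach(function (p) { next[p] = (1 - d) / pages.length; });
  pages.forEach(function (p) {
    links[p].forEach(function (q) {
      next[q] += d * rank[p] / links[p].length; // rank flows along each link
    });
  });
  rank = next;
  console.log(iter, rank['new'].toFixed(3));    // watch "new" climb, pass by pass
}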
Those are the ideas that come to mind so far.
-
RE: How to optimize SEO value of links in a calendar
Hi Chris,
With any calendar plugin or widget you're looking at, you can use the Moz toolbar's highlighter to instantly show you whether the links in the calendar, or event descriptions, are followed. That's the first thing to check out (I don't know if Google Calendar links are or not).
Something else to look at with calendaring is that apparently it's pretty easy to get rich snippets and nice placement in the SERPs if you mark up the events with schema.org/Event markup. You'd probably want to put that markup on the individual event pages, of course, and NOT in the calendar itself.
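For example, a minimal schema.org/Event microdata sketch for an individual event page (the event details here are invented):

<div itemscope itemtype="http://schema.org/Event">
  <span itemprop="name">Example Summer Festival</span>
  <meta itemprop="startDate" content="2014-08-15">
  <span itemprop="location" itemscope itemtype="http://schema.org/Place">
    <span itemprop="name">Example Fairgrounds</span>
  </span>
</div>

You can sanity-check the markup with Google's structured data testing tool before rolling it out.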
Third consideration: you'll want to be very careful about how many clicks it is from the home page to the individual event pages, and to try to minimize that. For instance, let's say you care about ranking for the name of a big event to be held in August 2014 now....and, to get to that event from your home page, you had to click:
- Events (then it shows March 2014 calendar)
- next month
- next month
- next month
- next month
- next month
- [name of event]
That'll flow piddly amounts of link juice to your August event's page, and your rankings will really suck.
You might address that by adding links to the next 12 or 24 months' calendar pages from the main Events page...or, even better, if Events is a main nav pulldown, have each of the next 12 or 24 months in the pulldown, so that the August 2014 calendar page gets a link from every single page on your site.
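Something like this in the main nav markup (the URLs are just placeholders):

<li>
  <a href="/events/">Events</a>
  <ul class="pulldown">
    <li><a href="/events/2014-03/">March 2014</a></li>
    <li><a href="/events/2014-04/">April 2014</a></li>
    <!-- ...one entry per month, out 12 or 24 months... -->
    <li><a href="/events/2014-08/">August 2014</a></li>
  </ul>
</li>

Since the nav appears on every page, each month page then gets a sitewide link instead of sitting 6 clicks deep.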
Internal link juice flow can make a massive difference to rankings, indexation, etc., so pay attention to this stuff.
Cheers,
Michael
-
RE: Best & easiest way to 301 redirect on IIS
You have two options:
- Set it up in IIS Manager (the best option: least overhead for the server, and no coding skills needed)
- Code it in classic ASP in a global include file that all pages reference before sending content back to the browser.
Here's a great article that walks you through the IIS config option. For this, you need access to IIS Manager.
Sounds like that option is unavailable to you however.
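For anyone else reading this who does have IIS Manager access: what that option sets up boils down to roughly this in web.config (the old/new page names are made up):

<configuration>
  <location path="oldpage.aspx">
    <system.webServer>
      <httpRedirect enabled="true"
                    destination="http://www.example.com/newpage.aspx"
                    httpResponseCode="Permanent"
                    exactDestination="true" />
    </system.webServer>
  </location>
</configuration>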
For the other option: your site probably has a file or two that's included at the start of all web pages. (If not, you can add it). In that file, you'll want to check the URL passed in like this:
Dim sThisPage, sThisServer
sThisPage = Request.ServerVariables("SCRIPT_NAME")
sThisServer = Request.ServerVariables("HTTP_HOST")
If (LCase(sThisPage) = "/oldpage.aspx") Then
    ' 301 to the new URL and stop processing the rest of the page
    Response.Status = "301 Moved Permanently"
    Response.AddHeader "Location", "http://" & sThisServer & "/newpage.aspx"
    Response.End
End If
-
RE: 1000 of links on my website ? is it good or bad
Hi Anders,
I would expect that Google sees and understands the jquery menu links. They've been able to find links in simple Javascript for several years now, and jquery is certainly one of the most common libraries out there, so I'd be surprised if they're not already all over it.
MC
-
RE: 1000 of links on my website ? is it good or bad
You've definitely got a lot of links in there in your navigation, but I don't think it's really going to hurt you. It's going to distribute link juice pretty evenly across all of the pages linked to from that left-hand menu and the submenus.
You MIGHT consider reducing the left menu to just the main categories, and NOT including all of the submenus. That would funnel a fair bit more link juice to the main categories, and the submenus I'm guessing are all pretty long-tail anyway, so they shouldn't need as much.
I don't see a duplicate content problem here. I'm seeing different content on every page. You might consider putting noindex/follow on some of the intermediate pages that merely list subcategories, like this one:
http://dorchdanola-netbutik.dk/category/belysning-el-artikler-485/
These kinds of pages will be seen as extremely content-light...and that's not so good.
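The noindex/follow bit is just one line in the <head> of those pages:

<meta name="robots" content="noindex, follow">

That keeps the page itself out of the index, but still lets Google follow its links and flow juice through to the subcategory pages.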
I wouldn't nofollow your social media links. And rel=me is mostly used to link an author page back to your G+ profile, so I wouldn't use it here.
-
RE: Should my canonical tags point to the category page or the filter result page?
I agree, that's a great approach. I think you mean Javascript, not Java, though (that's a different language). The only thing that might make this approach a challenge would be if you had so much product data before filtering that it caused a performance problem. Let's say you had 50 pages of results: if you filter server-side, you're only sending down 1 page of results, whereas if you're filtering with client-side Javascript, you've got to send all 50 pages down and then filter them in the browser.
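To make that trade-off concrete, client-side filtering is roughly this (the product data and filter field are made up):

// All 50 pages' worth of products have to come down to the browser first...
var products = [
  { name: 'Red widget', color: 'red', price: 10 },
  { name: 'Blue widget', color: 'blue', price: 12 }
  // ...potentially thousands more
];

// ...and only then does the filtering happen, in the browser:
function filterByColor(color) {
  return products.filter(function (p) {
    return p.color === color;
  });
}

console.log(filterByColor('red')); // just the red widgets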
-
RE: When should you redirect a domain completely?
I've been seeing it take Google a fairly long time to catch up with 301s and pass along the link juice....a couple of months. Consensus is that something around 90-95% of the link juice passes through a 301. There's some talk about 301s not passing link juice forever; I haven't seen any evidence of exactly how long that might be, but it seems like it would be in the 1-2 year range at least. Just my opinion & recollection of a whole bunch of other people's opinions though :-).