Do I have a Panda filter on a specific segment?
-
Our site gets a decent level of search traffic and doesn't have any site-wide penalty issues, but one of our sections looks like it might be under some form of filter. Unfortunately for us, it's our buy pages!
Check out http://www.carwow.co.uk/deals/Volkswagen/Golf: it's unique content, and I've built white-hat links into it, including about five from university websites (.ac.uk domains, DA 70+). If you search for something like "volkswagen golf deals", the pages ranking on page 1 have weak, thin content and pretty much no links.
That content section wasn't always unique; in fact, the vast majority of it could well be classed as duplicate content, as there's no trim data and the pages look like this: http://www.carwow.co.uk/deals/Fiat/Punto
While we never had much volume, traffic on all /deals/ pages appears to have dropped significantly around the time of the May Panda 4.0 update.
We're planning to completely relaunch these pages with a new design, unique trim content, and a paragraph (c. 200 words) about each model.
Am I right in assuming there's a Panda filter on the /deals/ segment, so that regardless of what I do to any one deals page it won't rank well, and we have to redo the whole section?
-
It is still possible this isn't a penalty from one of the major algorithms, and you may be able to solve it with a stronger internal linking strategy. It helps to use something like MindNode to map an overview of the site so you can then drill into the pages in question.
It is possible that a noindex would cure this, but it all depends: even if you add a noindex tag to a page, Google can still read the page and apply the penalty. All the tag means is that the page won't be indexed.
However, if you are relaunching everything very soon, you might be best sitting tight and not doing anything too rash as a short-term fix.
-Andy
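For reference, the noindex approach discussed here would look like the snippet below on each thin /deals/ page. This is a generic illustration, not carwow's actual markup, and (as Andy notes) it only keeps the page out of the index; Google can still crawl and read it.

```html
<!-- Illustrative only: a robots meta tag for a thin /deals/ page.
     "noindex, follow" asks search engines to drop the page from the
     index while still crawling it and following its links. -->
<head>
  <meta name="robots" content="noindex, follow">
</head>
```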
-
Positions as well: both traffic and rankings have dropped.
Some form of filter is the only explanation I can think of for why that VW Golf deals page doesn't perform. It has better content and decent links (OSE hasn't picked them up, but they're there).
We get c. 40k hits/month on our blog and c. 25k hits/month on our car reviews, entirely from organic search, but literally zero on the deals pages, where if anything the competition is weaker and the quality lower.
I wonder if placing a noindex tag on the deals pages with thin content would resolve the issue, though we'll be relaunching the whole segment in the coming weeks.
-
Hi James,
Am I right in assuming that there's a Panda filter on the /deals/ segment
Unfortunately there is no guaranteed way to say this is the case, but generally, if a drop in traffic or positions coincides with an algorithm refresh, that can be telling.
Is it just traffic to those pages that has dropped, or positions in the SERPs as well?
-Andy
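A rough way to sanity-check whether a drop "coincides with an algorithm refresh" is to compare average daily traffic in windows before and after a known update date. The sketch below uses made-up figures and an arbitrary 30-day window, not real carwow data; Panda 4.0 rolled out around 20 May 2014.

```python
from datetime import date, timedelta

PANDA_40 = date(2014, 5, 20)  # Panda 4.0 rollout date

def window_average(traffic, start, days):
    """Mean daily sessions over [start, start + days)."""
    vals = [traffic.get(start + timedelta(d), 0) for d in range(days)]
    return sum(vals) / days

def drop_ratio(traffic, update_date, days=30):
    """Ratio of post-update to pre-update average traffic.
    Values well below 1.0 suggest the drop coincides with the update."""
    before = window_average(traffic, update_date - timedelta(days), days)
    after = window_average(traffic, update_date, days)
    return after / before if before else float("nan")

# Toy data: ~500 sessions/day before the update, ~50 after.
traffic = {}
for d in range(30):
    traffic[PANDA_40 - timedelta(days=d + 1)] = 500
    traffic[PANDA_40 + timedelta(days=d)] = 50

print(drop_ratio(traffic, PANDA_40))  # 0.1
```

In practice you would feed this real daily organic sessions for the /deals/ segment exported from your analytics package, and check the ratio against other segments (blog, reviews) as a control.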