Do I have a Panda filter on a specific segment?
-
Our site gets a decent level of search traffic and doesn't have any site-wide penalty issues, but one of our sections looks like it might be under some form of filter. Unfortunately for us, it's our buy pages!
Check out http://www.carwow.co.uk/deals/Volkswagen/Golf: it has unique content and I've built white-hat links into it, including about five from university websites (.ac.uk domains, DA 70+). If you search for something like "volkswagen golf deals", the pages ranking on page 1 have weak, thin content and pretty much no links.
That content section wasn't always unique; in fact, the vast majority of it could well be classed as duplicate content, as there was no trim data and the pages looked like this: http://www.carwow.co.uk/deals/Fiat/Punto
While we never had much volume, traffic on all /deals/ pages appears to have dropped significantly around the time of the May 2014 Panda 4.0 update.
We're planning on completely relaunching these pages with a new design, unique trim content, and a paragraph (c. 200 words) about the model.
Am I right in assuming that there's a Panda filter on the /deals/ segment, so that regardless of what I do to any single deals page it won't rank well, and we have to redo the whole section?
-
It is still possible this isn't a penalty from one of the major algorithms, and you may be able to solve it by creating a strong internal linking strategy. It helps to formulate one if you use something like MindNode to create an overview of the site, so you can then drill into the pages in question.
It is possible that a noindex would cure this, but it all depends: even if you add a noindex tag to a page, Google can still read the page and apply the penalty. All the tag means is that the page won't appear in the index.
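If you do try noindex as a stopgap, it's worth auditing which deals pages actually carry the directive once it's in place. A minimal sketch of a checker, using only the standard library (the sample markup below is illustrative, not carwow's real templates; note that noindex can also be set via an X-Robots-Tag HTTP header, which this doesn't cover):

```python
# Detect a robots noindex directive in a page's HTML (meta tag only).
from html.parser import HTMLParser

class NoindexParser(HTMLParser):
    """Flags a page as noindexed if any <meta name="robots"> content contains 'noindex'."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        a = dict(attrs)
        name = (a.get("name") or "").lower()
        content = (a.get("content") or "").lower()
        if name == "robots" and "noindex" in content:
            self.noindex = True

def has_noindex(html: str) -> bool:
    parser = NoindexParser()
    parser.feed(html)
    return parser.noindex

# Example check against a fetched page's HTML (fetching omitted here):
print(has_noindex('<html><head><meta name="robots" content="noindex,follow"></head></html>'))
```

Running this over each /deals/ URL after the change would confirm the tag actually made it into the templates.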
However, if you are relaunching everything very soon, you may as well sit tight and not do anything too rash as a short-term fix.
-Andy
-
Positions as well.
Some form of filter is the only explanation I can think of for why that VW Golf deals page doesn't perform. It has better content and decent links (OSE hasn't picked them up, but they're there).
We get c. 40k hits/month on our blog and c. 25k hits/month on our car reviews, entirely from organic search, but literally zero on the deals pages, where if anything the competition is weaker and the quality lower.
I wonder if placing a noindex tag on the deals pages that have thin content would resolve the issue, but we'll be relaunching the whole segment in the coming weeks anyway.
-
Hi James,
"Am I right in assuming that there's a Panda filter on the /deals/ segment?"
Unfortunately, there is no guaranteed way to say this is the case, but generally, if a drop in traffic or positions coincides with an algorithm refresh, that can be telling.
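One way to sanity-check whether a drop really lines up with a refresh is to compare mean daily sessions in a window before and after the refresh date. A rough sketch, with made-up numbers standing in for a daily-sessions export from Analytics filtered to the /deals/ segment:

```python
# Compare mean daily sessions in the N days before vs. after a suspected
# algorithm refresh. Dates and session counts below are invented for illustration.
from datetime import date

def drop_around(sessions: dict, refresh: date, window: int = 14) -> float:
    """Fractional change in mean daily sessions in the `window` days
    after `refresh` versus the `window` days before it."""
    before = [v for d, v in sessions.items() if 0 < (refresh - d).days <= window]
    after = [v for d, v in sessions.items() if 0 <= (d - refresh).days < window]
    mean = lambda xs: sum(xs) / len(xs)
    return (mean(after) - mean(before)) / mean(before)

# e.g. ~100 sessions/day before the refresh date, ~40 after it:
data = {date(2014, 5, d): 100 for d in range(6, 20)}
data.update({date(2014, 5, d): 40 for d in range(20, 31)})
change = drop_around(data, date(2014, 5, 20))
print(f"{change:+.0%}")
```

A sustained step change like this that starts on or just after a confirmed refresh date is the kind of signal worth acting on; a gradual slide usually points elsewhere.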
Is it just traffic to those pages that has dropped, or positions in the SERPs?
-Andy