We seem to have been hit by the penguin update can someone please help?
-
Hi,

Our website www.wholesaleclearance.co.uk has been hit by the Penguin update. I'm not an SEO expert, and when I first started out my SEO got caught up in buying blog links. That was about two years ago, and since then I've worked really hard to get good manual links.

Does anyone know of a way to dig out any bad links so I can get them removed? Is there any software that will give me a list, or would any of you guys want to take a look for me? I'm willing to pay for the work.

Kind regards,
Karl
-
Also, Karl, Alexa does a fairly good job of tracking inlinks with referring URLs, which is pretty helpful.
Your site is here: http://www.alexa.com/siteinfo/wholesaleclearance.co.uk
And of course Open Site Explorer here will give you some details as well.
-
Those are very good questions.
I don't know of any tools to help determine if a link is good or bad. I generally look at the site and know if I want to be there, but when you're dealing with thousands of links it would be awfully nice to have a tool to help.
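For what it's worth, a crude pre-filter can at least put thousands of links into review order. The sketch below is a minimal Python triage script; the signals and weights are illustrative assumptions on my part, not anything Google has published, so treat the output as a "look at these first" list, never an automatic verdict:

```python
# A rough heuristic "bad link" pre-filter -- not a substitute for manual
# review. Signal list and weights are illustrative assumptions only.
from urllib.parse import urlparse

SPAMMY_TLDS = {".info", ".biz", ".xyz"}          # assumption: often low quality
SPAM_TOKENS = {"article", "directory", "forum", "blogroll", "linkexchange"}

def link_risk_score(source_url: str, anchor_text: str, target_keyword: str) -> int:
    """Return a 0-100 risk score; higher means 'review this link first'."""
    score = 0
    host = urlparse(source_url).netloc.lower()
    path = urlparse(source_url).path.lower()
    # Exact-match commercial anchor text was a classic Penguin trigger.
    if anchor_text.strip().lower() == target_keyword.lower():
        score += 40
    # Links buried in article-directory / forum style hosts or paths.
    if any(tok in host or tok in path for tok in SPAM_TOKENS):
        score += 30
    # Cheap TLDs correlate (weakly) with link networks.
    if any(host.endswith(tld) for tld in SPAMMY_TLDS):
        score += 20
    # Very long URLs often indicate auto-generated pages.
    if len(source_url) > 120:
        score += 10
    return min(score, 100)

def triage(links, target_keyword, threshold=40):
    """Sort (url, anchor) pairs riskiest-first; keep those above threshold."""
    scored = [(link_risk_score(u, a, target_keyword), u, a) for u, a in links]
    scored.sort(reverse=True)
    return [(u, a, s) for s, u, a in scored if s >= threshold]
```

The threshold and the token list are the knobs to tune once you've eyeballed a sample of the real link profile.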
-
Thank you for your reply, donford, I'll take a look. Do you know how I will know what is classed as a spammy link, and will I be able to find out which ones are?
-
Hi Karl, do you have a Google Webmaster Tools account? This should show you who Google sees as linking to you.
For EN-US: sign into Google Webmaster Tools, click the "Your Site on the Web" nav link, then "Links to Your Site". (It may be called something different for EN-UK.) You can export the links from there. There are several free tools that will check your inlinks, but since you referenced the Penguin update as the problem, I'd use Google as my source.
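As a rough illustration, once you've got that export, something like the following can collapse it into referring domains ranked by link count, which makes the review far more manageable than a raw URL list. This assumes the export has been reduced to one URL per line; the real file's columns vary:

```python
# Sketch: collapse an exported "links to your site" list into referring
# domains, ranked by how many linking URLs each domain contributes.
from collections import Counter
from urllib.parse import urlparse

def referring_domains(link_urls):
    """Count linking URLs per host (naive: full netloc, minus a www. prefix)."""
    counts = Counter()
    for url in link_urls:
        host = urlparse(url.strip()).netloc.lower()
        if host.startswith("www."):
            host = host[4:]
        if host:
            counts[host] += 1
    return counts.most_common()
```

Domains contributing hundreds of links from one template are usually the first thing worth a closer look.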
Related Questions
-
HELP!! We are losing search visibility fast and I don't know why?
We have recently moved from http to https - could this be a problem? https://www.thepresentfinder.co.uk As far as I'm aware we are following SEO best practice and have no manual penalties; all content is unique and we are not doing any link farming, etc.
White Hat / Black Hat SEO | The-Present-Finder
-
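For an http-to-https move like the one described above, one quick sanity check is that every legacy http URL answers with a single 301 to its exact https twin. Below is a minimal sketch of that check; the live-crawl part assumes outbound network access, while `check_redirect()` is pure so the logic can be verified offline:

```python
# Sketch: verify an http->https migration. Each legacy URL should return
# one 301 hop pointing at the identical URL with the scheme swapped.
from urllib.parse import urlparse, urlunparse
import urllib.request
import urllib.error

def expected_https(url: str) -> str:
    """The https twin of an http URL (same host, path, query)."""
    return urlunparse(urlparse(url)._replace(scheme="https"))

def check_redirect(url: str, status: int, location: str):
    """Return (ok, reason) for one observed first-hop redirect."""
    if status != 301:
        return False, f"expected 301, got {status}"
    if location != expected_https(url):
        return False, f"redirects to {location}, not the https twin"
    return True, "ok"

def crawl(urls):
    """Live check; redirects are not followed so we see the first hop only."""
    class NoRedirect(urllib.request.HTTPRedirectHandler):
        def redirect_request(self, *args, **kwargs):
            return None  # surface the redirect as an HTTPError instead
    opener = urllib.request.build_opener(NoRedirect)
    results = {}
    for url in urls:
        try:
            resp = opener.open(url, timeout=10)
            status, location = resp.status, ""
        except urllib.error.HTTPError as e:
            status, location = e.code, e.headers.get("Location", "")
        results[url] = check_redirect(url, status, location)
    return results
```

Redirect chains (http -> https -> https-with-trailing-slash, say) would fail this deliberately strict check, which is usually what you want to surface.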
SERPs Help
Hey Mozzers, please can someone advise? I manage the online content for an estate of gyms in the UK. We had an existing gym location in Birmingham (www.nuffieldhealth.com/gyms/birmingham), and 5 months ago we opened a new location in Birmingham (www.nuffieldhealth.com/gyms/birmingham-central). The two pages have different in-page content, different H1s, different title tags, and different in-page citations, and both have a few backlinks from different root domains. However, the second page (birmingham-central) does not rank in the top 50 results, even though our domain is stronger than the vast majority of results. Our original page (/gyms/birmingham) also slipped from page 1 in the SERPs to the bottom of page 2 when the second Birmingham gym page was deployed. I am guessing Google does not know which page to serve in the SERPs, but I am at a loss as to how to fix this issue. Can anyone please advise? Regards, Ben
White Hat / Black Hat SEO | Bendall
-
Google Panda and Penguin "Recovery"
We're working with a client who was hit by Google Panda (duplicate content, copyright infringement) and Google Penguin (poor backlinks). While it has taken a lot of time, effort and patience to eradicate these issues, it has still been more than 6 months without any improvement. Have you experienced longer recovery periods? I've seen sites perform every black hat technique under the sun and, nearly 2 years later, still no recovery! In addition, many companies I've spoken to advised their clients to begin right from the very beginning with a new domain, site, etc.
White Hat / Black Hat SEO | GaryVictory
-
Our site has too many backlinks! How can we do a bad backlink audit?
Webmaster Tools is saying we have close to 24 million links to our site. The site has been around since the mid 90s and has accumulated all these links since. We also have our own network of sites that have links in their templates to our main site. I'm fighting to get these links nofollow'd, but upper management seems scared to alter this practice. This past year we've found our rankings have dropped significantly, and we suspect it's due to some spammy backlinks, or being penalized for an accidental link scheme network. 24 million links is too many to check manually for the disavow tool, and it seems that bulk services out there to check backlinks can't even come close. What's an SEO to do?
White Hat / Black Hat SEO | seoninjaz
-
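At 24 million links, a domain-level pass is about the only practical option: aggregate the backlinks by domain, review the biggest contributors by hand, then emit Google's disavow file format (`domain:example.com` lines, with `#` comments). A minimal sketch, which deliberately leaves the "is this domain bad?" judgement to a human:

```python
# Sketch: domain-level disavow workflow for a link profile too big to
# audit URL by URL. Which domains go in bad_domains is a human decision.
from collections import Counter
from urllib.parse import urlparse

def domains_by_volume(backlink_urls):
    """Rank linking domains by how many backlink URLs they contribute."""
    counts = Counter(urlparse(u).netloc.lower().removeprefix("www.")
                     for u in backlink_urls)
    counts.pop("", None)  # drop unparseable entries
    return counts.most_common()

def disavow_file(bad_domains, note="generated for review, not auto-submission"):
    """Render the reviewed bad-domain set in Google's disavow format."""
    lines = [f"# {note}"]
    lines += [f"domain:{d}" for d in sorted(bad_domains)]
    return "\n".join(lines) + "\n"
```

Domain-level `domain:` rules are also far more robust here than per-URL rules, since a spammy template link typically appears on every page of the linking site.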
Will cleaning up old pr articles help serps?
For a few years we published articles with anchor-text backlinks on about 10 different article submission sites. Each article was modified to create similar but different articles; we have about 50 completely unique articles. This worked really well for our SERPs until the Google Panda and Penguin updates. I am looking for advice on whether I should have a major clean-up of the published articles, and if so, should I be deleting them, or removing or renaming the anchor-text backlinks? Any advice on what strategy would work best would be appreciated, as I don't want to start deleting backlinks and make it worse. We used to enjoy position 1 but are now at 12-15, so we have lost most of our traffic.
White Hat / Black Hat SEO | devoted2vintage
-
Penguin link removal what would you do?
Hi. Over the last 4 months I have been trying to remove as many poor-quality links as possible in the hope this will help us recover. I have come across some sites where the page our backlink is on has been de-indexed; Google shows this when I look at the cached page:

404. That's an error. The requested URL /search?sourceid=navclient&ie=UTF-8&rlz=1T4GGNI_enGB482GB482&q=cache:http%3A%2F%2Fforom.eovirtual.com%2Fviewtopic.php%3Ff%3D4%26t%3D84 was not found on this server. That's all we know.

If Google is showing this message, do I still have to try to remove the link, or is it the case that Google has already dismissed it?
White Hat / Black Hat SEO | wcuk
-
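One small automation that helps in this situation: before spending outreach effort, check whether the pages hosting your backlinks still resolve at all. The bucketing below is just a triage convention (dead pages are usually a lower removal priority than live ones); it is not a statement about how or when Google stops counting a dead link:

```python
# Sketch: bucket backlink source pages into live / gone / error so that
# link-removal outreach can be prioritised. Triage heuristic only.
import urllib.request
import urllib.error

def classify_status(status_or_exc):
    """Map an HTTP status (or 'dns-failure') to a triage bucket."""
    if status_or_exc == "dns-failure":
        return "gone"          # the whole host no longer resolves
    if status_or_exc in (404, 410):
        return "gone"          # the hosting page itself is dead
    if 200 <= status_or_exc < 300:
        return "live"          # worth outreach
    return "error"             # 5xx, 403, etc. -- recheck later

def check_pages(urls):
    """Fetch each backlink source page and classify it."""
    results = {}
    for url in urls:
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                results[url] = classify_status(resp.status)
        except urllib.error.HTTPError as e:
            results[url] = classify_status(e.code)
        except (urllib.error.URLError, OSError):
            results[url] = classify_status("dns-failure")
    return results
```

Re-running the "gone" bucket a few weeks later catches sites that were only temporarily down.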
"Unnatural Linking" Warning/Penalty - Anyone's company help with overcoming this?
I have a few sites where I didn't manage the quality of my vendors, and I'm now staring at some GWT warnings for unnatural linking. I'm assuming a penalty is coming down the pipe, and unfortunately these aren't my sites, so I'm looking to get on the ball with unwinding anything we can as soon as possible. Does anyone's company have experience with this, or could you pass along a reference to another company that has successfully dealt with these issues? A few items that come to mind include solid and speedy processes for removing offending links, and properly dealing with the resubmission request.
White Hat / Black Hat SEO | b2bmarketer
-
Tricky Decision to make regarding duplicate content (that seems to be working!)
I have a really tricky decision to make concerning one of our clients. Their site to date was developed by someone else. They have a successful eCommerce website, and the strength of their search engine performance lies in their product category pages. In their case, a product category is an audience niche: the buyer's gender and age. In this hypothetical example my client sells lawnmowers:

http://www.example.com/lawnmowers/men/age-34
http://www.example.com/lawnmowers/men/age-33
http://www.example.com/lawnmowers/women/age-25
http://www.example.com/lawnmowers/women/age-3

For all searches pertaining to lawnmowers and the gender and age of the buyer (of which there are a lot for the 'real' store), these pages come up number one for every combination they have a page for. The issue is the specific product pages, which take the form of the following:

http://www.example.com/lawnmowers/men/age-34/fancy-blue-lawnmower

This same product, with the same content (save a reference to the gender and age on the page), can also be found at the other gender/age combinations the product is targeted at. For instance:

http://www.example.com/lawnmowers/women/age-34/fancy-blue-lawnmower
http://www.example.com/lawnmowers/men/age-33/fancy-blue-lawnmower
http://www.example.com/lawnmowers/women/age-32/fancy-blue-lawnmower

So: duplicate content. As they are currently doing so well, I am agonising over this. I dislike seeing the same content on multiple URLs, and though it wasn't a malicious effort on the previous developer's part, I think it a little dangerous in terms of SEO. On the other hand, if I change it I'll reduce the website's size and severely reduce the number of pages that are contextually relevant to the gender/age category pages. In short, I don't want to sabotage the performance of the category pages by cutting off all their on-site relevant content.

My options as I see them are:

1. Stick with the duplicate content model, but add some unique content to each gender/age page. This will differentiate the product category page content a little.
2. Move products to single, distinct URLs. While this could boost individual product SEO performance, that isn't an objective, and it carries the risks I perceive above.

What are your thoughts?

Many thanks, Tom
White Hat / Black Hat SEO | SoundinTheory
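If the duplicate-URL model above is kept, one standard mechanism worth knowing about is a rel="canonical" link on each variant page pointing at a chosen primary URL. The sketch below is tooling, not a recommendation (whether consolidating is the right strategic call is exactly the open question): it extracts the declared canonical from a page's HTML so you can audit whether every gender/age variant of a product names the same primary:

```python
# Sketch: audit rel="canonical" consistency across duplicate product URLs.
# Extracting the tag is mechanical; choosing the primary URL is strategy.
from html.parser import HTMLParser

class CanonicalParser(HTMLParser):
    """Capture the href of the first <link rel="canonical"> encountered."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel", "").lower() == "canonical":
            if self.canonical is None:
                self.canonical = a.get("href")

def extract_canonical(html_text: str):
    p = CanonicalParser()
    p.feed(html_text)
    return p.canonical

def variants_consolidated(pages_html):
    """True if every variant page declares one and the same canonical URL."""
    canonicals = {extract_canonical(h) for h in pages_html}
    return len(canonicals) == 1 and None not in canonicals
```

Fetching each variant's HTML and feeding it through `variants_consolidated()` quickly flags products whose duplicates disagree about (or omit) their primary URL.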