Is this Penguin 2.0 or just a temporary situation?
-
Hey,
I changed my hosting provider on 22 May because of better server hardware, and many of my results dropped on Google! My website only launched on 17 Jan 2013.
Maybe you want to look at my anchors; you can find them in the attached image. Total backlinks: 4.9K.
Is this a temporary situation because of the IP address change (new hosting provider), or is it Penguin 2.0?
-
If your backlink profile is good, then your traffic drop, your IP address change, and Penguin 2.0 all happening on the same day is just a coincidence; wait it out. You're obviously doing something well to have earned all those backlinks. If your link profile is on the low-quality side, it's probably time to start cleaning house and doing the things Penguin victims have to do. Here's a little more info on the May 22nd Penguin update.
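For the "cleaning house" step, one common tool is a disavow file submitted through Google Search Console. A minimal sketch of the format (the domains and URL below are hypothetical examples, not from this thread):

```text
# Disavow file sketch -- all domains/URLs below are hypothetical.
# Lines starting with "#" are comments and are ignored by Google.

# Disavow every link from an entire low-quality domain:
domain:spammy-directory.example
domain:paid-links.example

# Disavow one specific URL:
http://blog.example/low-quality-guest-post.html
```

The file is plain UTF-8 text uploaded via the Disavow Tool; disavowing is generally treated as a last resort, after link removal requests have failed.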
-
What do you suggest now?
Should I wait a little bit longer, or start making changes?
-
Almost 5K backlinks in 5 months is not bad, as long as they're coming from legitimate sources and the growth is consistent. If I were going to guess whether it was the IP address change or the backlinks that were the root of the problem, I'd guess the backlinks.
-
Interesting, thanks for the follow-up. I hadn't heard about this signal.
It seems that was over a week ago, and the reason you reposted is because you want to confirm whether the IP address is really impacting the rankings.
Lauren Vincent sounds like she has had some experience with it; I recommend contacting her to ask about the severity and duration of these penalties.
The second response to your original post suggested that the penalty should not be significant. So something is up, and it's best to contact someone who knows!
(I also recommend editing today's post to include the original post.)
-
Hey Shu,
Thanks for the response.
This link is my first question on Moz, where Lauren Vincent said this is normal after changing IP address:
http://moz.com/community/q/ip-address-changed-and-some-rankings-drop
-
Changing your hosting provider does not impact your rankings in any significant manner. Something significant, like changing your domain/subdomain or a bunch of URLs, would be necessary for Google to ding your site.
Since you pointed out that you changed hardware right around when Penguin came out, I would say the culprit is Penguin.
Related Questions
-
SEO friendly H1 tag with 2 text lines
Hi everyone, I am trying to add span tags in an H1, break it onto 2 lines, and style each line of the H1 differently. Example: Line 1 / Line 2. I might add a smaller font for line 2 as well... Is this SEO friendly? Will crawlers read the entire text, or can the markup interfere and block it? Thank you!
Intermediate & Advanced SEO | bgvsiteadmin
-
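A minimal sketch of the markup the H1 question above describes (the class names are made up for illustration). Inline spans and a line break inside an H1 are presentational; the heading text crawlers see is still the concatenated content:

```html
<!-- One H1 whose two lines are styled differently via spans -->
<h1>
  <span class="headline-main">Line 1</span><br>
  <span class="headline-sub">Line 2</span>
</h1>
<style>
  .headline-main { font-size: 2rem; }
  .headline-sub  { font-size: 1.25rem; }
</style>
```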
Why do people put XML sitemaps in subfolders? Why not just the root? What's the best solution?
Just read this: "The location of a Sitemap file determines the set of URLs that can be included in that Sitemap. A Sitemap file located at http://example.com/catalog/sitemap.xml can include any URLs starting with http://example.com/catalog/ but can not include URLs starting with http://example.com/images/." here: http://www.sitemaps.org/protocol.html#location Yet surely it's better to put the sitemaps at the root so you have:
(a) http://example.com/sitemap.xml
http://example.com/sitemap-chocolatecakes.xml
http://example.com/sitemap-spongecakes.xml
and so on... OR this kind of approach:
(b) http://example.com/sitemap.xml
http://example.com/sitemap/chocolatecakes.xml and
http://example.com/sitemap/spongecakes.xml
I would tend towards (a) rather than (b). Which is the best option? Also, can I keep the structure the same for sitemaps that are subcategories of other sitemaps? For example, for a subcategory of http://example.com/sitemap-chocolatecakes.xml I might create http://example.com/sitemap-chocolatecakes-cherryicing.xml, or should I add a subfolder to turn it into http://example.com/sitemap-chocolatecakes/cherryicing.xml? Look forward to reading your comments - Luke
Intermediate & Advanced SEO | McTaggart
-
Penguin and 301 redirects...
Hi, I have several questions about starting a new domain due to Penguin. The site is: http://bajajlaw.com. Quick backstory: this site was hit every time Penguin rolled out. No clean-up was done until October 2015. At that time, I took over the project. My efforts included: (1) Remove'em, (2) manual removal, and (3) the Disavow Tool. The HP went from around #50 for the target KW (San Diego criminal defense attorney) to about #25, and never really moved higher than that. However, I redid the content for the internal pages (DV, Theft Crimes, etc.) and they are all ranking fairly well (first page or top of the 2nd). In short, the penalty only seems to affect the HP, not the internal pages. Instead of waiting for Penguin to roll out, the client wants to move forward with a new domain. My questions are as follows:
1. Can I use the same content for the internal pages and 301 from the old internal pages to the new ones?
2. Should I 301 from the old domain to the new domain for the HP, or not?
3. If I do a 301 from an internal page to a new internal page, does that have the same effect as doing a 301 from the old HP to the new HP?
I have read various opinions on this topic. I'd appreciate feedback from anyone who has experience doing this sort of thing. Thanks. P.S. I'm inclined to wait for P4 to roll out, but given that nobody seems to know when that might be, it's hard for me to advise the client to keep waiting for it.
Intermediate & Advanced SEO | mrodriguez1440
-
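If the move in the question above goes ahead, the domain-to-domain 301s it asks about are typically done with a single wildcard rewrite rule on the old domain's server. A sketch for Apache (the new domain below is a placeholder, not a real site):

```apache
# .htaccess on the old domain: 301 every path to the same path
# on the new domain (newdomain.example is a placeholder).
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?bajajlaw\.com$ [NC]
RewriteRule ^(.*)$ http://newdomain.example/$1 [R=301,L]
```

This preserves the path, so each old internal page redirects to its new counterpart rather than everything funnelling to the new homepage.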
Product Pages & Panda 4.0
Greetings Moz Community: I operate a real estate web site in New York City (www.nyc-officespace-leader.com). Of the 600 pages, about 350 of the URLs are product pages written about specific listings. The content on these pages is quite short, sometimes only 20 words. My ranking has dropped very much since mid-May, around the time of the new Panda update. I suspect it has something to do with the very short product pages, the 350 listing pages. What is the best way to deal with these pages so as to recover ranking? I am considering these options:
1. Setting them to "no-index". But I am concerned that removing product pages is sending the wrong message to Google.
2. Enhancing the content and making certain that each page has at least 150-200 words. Re-writing 350 listings would be a real project, but if necessary to recover I will bite the bullet.
What is the best way to address this issue? I am very surprised that Google does not understand that product URLs can be very brief and yet have useful content. Information about a potential office rental that lists location, size, and price per square foot is valuable to the visitor but can be very brief, especially for listings that change frequently. So I am surprised by the penalty. Would I be better off not having separate URLs for the listings and, for instance, adding them as posts within building pages? Is having separate URLs for product pages with minimal content a bad idea from an SEO perspective? Does anyone have any suggestions as to how I can recover from this latest Panda penalty? Thanks, Alan
Intermediate & Advanced SEO | Kingalan1
-
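If option 1 in the question above (no-indexing the thin listing pages) were chosen, the usual mechanism is a robots meta tag in each listing page's head. A minimal sketch:

```html
<!-- In the <head> of each thin listing page: keep it out of the
     index while still letting crawlers follow its links -->
<meta name="robots" content="noindex, follow">
```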
Directory Listings with anchor text = Should we just Delete them all?
Hi Moz'ers,
Our ex-contracted SEO "experts" have enlisted our website in 40-odd directories with the same keyword and copy. Is it best to just delete them all to avoid a possible negative SEO/Google outcome? Cheers
Intermediate & Advanced SEO | supps
-
Can we have 2 websites with same business name and same business address?
I have 2 websites with the same business name and same business address, and obviously 2 different domain names. I am providing the same services from both websites. Is this a problem?
Intermediate & Advanced SEO | AlexanderWhite
-
Penguin or paid link penalty, or both?
Hello, I have a site, macpokeronline.com, that has seen a dramatic decrease in visitors in the last few months; traffic has gone down from 800 per day to 200 per day. It is a pretty complex situation. The site owner purchased paid links from reputable Mac sites for years (they were more like followed advertisements, but were only there for SEO purposes). Now that I'm going through the link profile in OSE, I can see that the majority of their links come from these sites. There is also a branding issue: there are almost 15,000 links with the anchor text "macpokeronline.com". These are obviously branded links, and I don't know the best way to deal with them (though the majority are coming from the paid link sites). We have just sent the request to remove the paid links from the sites, and I'm guessing that since he is paying over $1000 a month for the links, they will be removed quickly. The site has been receiving significantly less traffic since Penguin (Apr 24-25). We received a message on July 19th, which was the generic unnatural link warning, saying that once we remove links we should make a reconsideration request. Then on July 23rd, we received another message saying they are taking a "very targeted action on the unnatural links instead of your site as a whole", which I have never seen before. This damage was done before I was hired by this client; I just want to get his traffic back up so I can help him even further, and I want to know more about the steps I should take. 1. I will definitely remove the paid ads. What else should I do? Thanks, Zach
Intermediate & Advanced SEO | BestOdds
-
Penguin Update Issues.. What would you recommend?
Hi, we've been pretty badly hit by this Penguin update. Site traffic is down 40-50%. We suspect it's for a couple of reasons:
1. Google is saying we have duplicate content. E.g., for a given category we will have 4-5 pages of content (products), so it's saying pagenum=2, pagenum=3, etc. are duplicate pages. We've implemented rel=canonical so that pagenum=2 points to the original category, e.g. http://mydomain/widgets.aspx. We've even specified pagenum as a URL parameter that paginates. Google still hasn't picked up these changes. How long does it take? It's been about a week.
2. They're saying we have soft 404 errors. E.g., when we remove a category or product, we point users to a category or "page not found" page. Is it best to block Googlebot from crawling these pages by specifying them in robots.txt, since we really don't care about these categories or product pages? How best to handle this?
3. There are some bad directories and crawlers that have crawled our website but have put in incorrect links, so we've got about 1700 "product not found" errors. I'm sure that's taking up a lot of crawling time. How do we tell Google not to bother with these links coming from specific sources, e.g. ignore all links coming from xxx.com?
Any help will be much appreciated, as this is killing our business. Jay
Intermediate & Advanced SEO | ConservationM
-
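For issue 1 in the question above, the rel=canonical setup described would look something like this on page 2 of a category (URLs follow the question's own pattern). Note that at the time, Google also supported rel=prev/next for declaring a paginated series, which is often a better fit than canonicalising every page to page 1:

```html
<!-- In the <head> of http://mydomain/widgets.aspx?pagenum=2 -->
<link rel="canonical" href="http://mydomain/widgets.aspx">

<!-- Alternative: declare the paginated series instead -->
<link rel="prev" href="http://mydomain/widgets.aspx">
<link rel="next" href="http://mydomain/widgets.aspx?pagenum=3">
```

Canonicalising paginated pages to page 1 tells Google the later pages are duplicates of page 1, which can hide the products listed on them; the prev/next approach keeps each page indexable as part of one series.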