3-month-old site lost almost all of its traffic overnight
-
Hi All,
I started an Indian coupon and deal site, http://www.couponspy.in/, around three months ago, and traffic increased almost daily. But yesterday my site lost almost all of its traffic. Keywords that ranked 1-5 dropped around 4-15 places, and keywords that ranked 6-20 dropped around 20-50 places. Moz Crawl Diagnostics doesn't indicate any major issues. Has there been a Google Panda update in India? Any ideas why my site has been affected? Please help!
I have seen the same traffic decrease on other coupon start-ups, e.g. https://www.cuponation.in/. Did we all make the same mistake? Any guesses?
-
It looks like you have a few followed site-wide footer links, the type I advise people to watch out for. There are also a couple of press-release links passing equity. I wouldn't be surprised if Google simply devalued some of those links. Keep doing great work, keep adding unique value for your users, and avoid link schemes. You'll get there eventually.
-
Hi,
If it is a new site, it is not unusual for it to appear in pretty high positions and then fall back down once Google has decided how it wants to rank things. Quite possibly the same thing is happening with the other start-ups you mention, if they came online around the same time. Lots of new sites aimed at similar keywords will certainly give the search engines some food for thought about how and where to rank them. If you keep adding good content, attracting decent links, and keeping the site strong and technically sound, you should see your rankings move up again in the medium term.
It's also worth mentioning that the coupons/deals sector can be pretty competitive and has a tendency toward thin content and spammy link building, so it's worth looking at your top competitors and trying to see what they are doing that is working for them. Did your site get caught up in a Panda update? Maybe, but the way to approach it doesn't really change: address thin-content issues, create good content wherever possible, and grow your link profile naturally. Check out this post, which is just as relevant now as it was two years ago!
-
Hi ParvatiSingh,
It looks like you might have been hit by one of the recent Google algorithm updates; there have been several in the last three months, such as Penguin 2.0. Unfortunately, Google does not announce international roll-out dates; we only know when updates roll out in the US.
Carla
Related Questions
-
International Sites and Duplicate Content
Hello, I am working on a project where I have some doubts about the structure of international, multi-language sites. The website is in the fashion industry, which I think makes this a common problem for the sector. The site is translated into 5 languages and sells in 21 countries. As you can imagine, this creates a huge number of URLs, so many that I can't even complete a crawl with Screaming Frog. For example, the UK site is visible in all of these versions: http://www.MyDomain.com/en/GB/ http://www.MyDomain.com/it/GB/ http://www.MyDomain.com/fr/GB/ http://www.MyDomain.com/de/GB/ http://www.MyDomain.com/es/GB/ Obviously, for SEO only the first version matters. As another example, the French site is available in 5 languages: http://www.MyDomain.com/fr/FR/ http://www.MyDomain.com/en/FR/ http://www.MyDomain.com/it/FR/ http://www.MyDomain.com/de/FR/ http://www.MyDomain.com/es/FR/ And so on. This is creating 3 main issues: endless crawling, with crawlers not focusing on the most important pages; duplication of content; and wrong geo URLs ranking in Google. I have already implemented hreflang but haven't noticed any improvement. So my question is: should I exclude the inappropriate targeting with robots.txt and noindex? For example, for the UK leave only the English version crawlable, i.e. http://www.MyDomain.com/en/GB/, for France just the French version, http://www.MyDomain.com/fr/FR/, and so on. What I would like to achieve is crawlers more focused on the important SEO pages, no content duplication, and no wrong URLs ranking in local Google results. Please comment.
International SEO | guidoampollini
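A minimal sketch of the robots.txt approach the question proposes, assuming the hypothetical `/{lang}/{country}/` scheme above: generate a Disallow rule for every language/country pairing except the locally appropriate one, so crawl budget concentrates on the pages that should rank.

```python
# Hypothetical mapping based on the question's URL scheme: for each country
# folder, only one language version should stay crawlable.
PREFERRED_LANG = {"GB": "en", "FR": "fr", "IT": "it", "DE": "de", "ES": "es"}
LANGUAGES = ("en", "it", "fr", "de", "es")

def robots_disallow_rules():
    """Yield a Disallow line for every language/country pairing that is not
    the preferred one (keep /en/GB/, block /it/GB/, /fr/GB/, ...)."""
    for country, preferred in sorted(PREFERRED_LANG.items()):
        for lang in LANGUAGES:
            if lang != preferred:
                yield f"Disallow: /{lang}/{country}/"

rules = list(robots_disallow_rules())
print("\n".join(rules))
```

One caveat worth weighing before adopting this: a URL blocked in robots.txt cannot be crawled at all, so any hreflang or canonical annotations on those blocked pages will no longer be seen.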
If I redirect based on IP will Google still crawl my international sites if I implement Hreflang
We are setting up several international sites. Ideally, we wouldn't set up any redirects, but if we have to (for merchandising reasons, etc.) I'd like to assess the next best option. A secondary option could be to implement the redirects based on IP. However, Google then wouldn't be able to access the content for all the international sites (we're setting up six in total) and would only index the .com site. I'm wondering whether the hreflang annotations would still allow Google to find the international sites? If not, that's a lot of content we would not be fully benefiting from. Another option could be to treat the Googlebot user agent differently, but that would probably be considered cloaking by the G-Man. If there are any other options, please let me know.
International SEO | Ben.JD
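For context on what those hreflang annotations look like, here is a minimal sketch with made-up domain names. The key property is that every page variant lists every other variant (plus itself and an x-default), so the alternates are at least discoverable from the markup regardless of where a crawler lands first; whether Google can actually fetch them still depends on the IP redirect not blocking Googlebot.

```python
def hreflang_links(alternates, x_default):
    """alternates: mapping of hreflang code -> absolute URL.
    Returns the <link> tags that should appear on *each* of those pages."""
    links = [
        f'<link rel="alternate" hreflang="{code}" href="{url}" />'
        for code, url in sorted(alternates.items())
    ]
    links.append(
        f'<link rel="alternate" hreflang="x-default" href="{x_default}" />'
    )
    return links

# Hypothetical three-site setup; the x-default points at the .com site.
tags = hreflang_links(
    {"en": "https://www.example.com/",
     "fr": "https://www.example.fr/",
     "de": "https://www.example.de/"},
    x_default="https://www.example.com/",
)
print("\n".join(tags))
```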
Subdomains or subfolders for language specific sites?
We're launching an .org.hk site with English and Traditional Chinese variants. As the local population speaks both languages we would prefer not to have separate domains and are deciding between subdomains and subfolders. We're aware of the reasons behind generally preferring folders, but many people, including moz.com, suggest preferring subfolders to subdomains with the notable exception of language-specific sites. Does this mean subdomains should be preferred for language specific sites, or just that they are okay? I can't find any rationale to this other than administrative simplification (e.g. easier to set up different analytics / hosting), which in our case is not an issue. Can anyone point me in the right direction?
International SEO | | SOS_Children0 -
Improving Search Rankings in other Countries for an existing site
Hello SEOmoz, I have a very well respected international client who ranks high in the US and in English-language Google search results worldwide. However, the client's foreign-language pages for specific countries do not show up on the first page of SERPs in those countries. The foreign nation/language pages are served on the same root domain as the main English-language site in this fashion: www.client.com/france www.client.com/brazil Here are my questions: What can we do from an SEO standpoint to improve SERPs in Google.fr or other countries? What is the best way to prevent duplicate-content errors or prevent the wrong page from being indexed abroad? What are some best practices when using Google Webmaster Tools in this regard? Thanks
International SEO | BPIAnalytics
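One common piece of the answer here is pairing each country folder with a language-region hreflang code, so Google.fr and Google.com.br can be shown the right page and the English page isn't treated as a competing duplicate. A minimal sketch, using the question's folder structure and hypothetical language-region assignments:

```python
# Hypothetical mapping: country folder on www.client.com -> hreflang code.
FOLDER_TO_HREFLANG = {
    "": "en",          # www.client.com/ (main English site)
    "france": "fr-FR",
    "brazil": "pt-BR",
}

def hreflang_for(base, folder_map):
    """Return (hreflang code, absolute URL) pairs for every folder; these
    alternates would then be annotated on each of the pages."""
    pairs = []
    for folder, code in sorted(folder_map.items()):
        url = base + (f"/{folder}/" if folder else "/")
        pairs.append((code, url))
    return pairs

pairs = hreflang_for("http://www.client.com", FOLDER_TO_HREFLANG)
print(pairs)
```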
How to optimise a site for 2 countries
Hi there - any help with the below is much appreciated. I am helping an Australian company that produces packaging products for businesses. Their site is hosted in Australia and their offices are in Australia. They have asked me to take care of both on-page and off-page SEO so that they rank for keywords related to their products, e.g. 'cardboard boxes'. This should be fairly straightforward for Australian-based (.com.au) searchers, but they also supply their products to South Africa, and so want their results to show up for South African-based (.co.za) searchers as well. Also consider: it is not typical for searchers for these products to use geo-modifiers in their search terms, and there is no unique content for the South African market versus the Australian; the product information is essentially identical. What should we do to ensure their results show up equally for those in South Africa as well as Australia? I am considering building a completely separate site, hosted in South Africa and specifically for the S.A. market, but will duplicate content be an issue? This would also essentially mean double the SEO effort; is there no way I could achieve our goals more efficiently? Many thanks for any help.
International SEO | dnaynay
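For the two-country, one-language case described above, paired en-AU / en-ZA hreflang annotations are the usual way to tell Google which copy of essentially identical content belongs to which market. A minimal sketch (domain names hypothetical) using the XML-sitemap form of hreflang, which avoids editing page templates:

```python
# Hypothetical alternates: the same product page on an .com.au site and a
# would-be .co.za site.
ALTERNATES = {
    "en-AU": "https://www.example.com.au/cardboard-boxes/",
    "en-ZA": "https://www.example.co.za/cardboard-boxes/",
}

def sitemap_url_entry(loc, alternates):
    """Build one <url> sitemap entry carrying xhtml:link hreflang
    alternates (each variant's entry lists all variants)."""
    lines = ["<url>", f"  <loc>{loc}</loc>"]
    for code, href in sorted(alternates.items()):
        lines.append(
            f'  <xhtml:link rel="alternate" hreflang="{code}" href="{href}" />'
        )
    lines.append("</url>")
    return "\n".join(lines)

entry = sitemap_url_entry(ALTERNATES["en-AU"], ALTERNATES)
print(entry)
```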
Non US site pages indexed in US Google search
Hi, We are having a global, site-wide issue with non-US site pages being indexed by Google and served up in US search results. Conversely, we have US en pages showing in the Japan Google search results. We currently use IP detection to direct users to the correct regional site, but it isn't effective if users enter through an incorrect regional page. At the top of each of our pages we have a drop-down menu that allows users to manually select their preferred region. Is it possible that Googlebot is crawling these links and indexing these other regional pages as US, without detecting the difference due to our URL structure? Below are examples of two of our URLs for reference, one from Canada, the other from the US: /ca/en/prod4130078/2500058/catalog50008/ /us/en/prod4130078/2500058/catalog20038/ If that is, in fact, what is happening, would setting the links within the drop-down to 'nofollow' address the problem? Thank you. Angie
International SEO | Corel
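Since the region and language are the first two path segments in the URLs above, one way to audit what is happening is to parse crawled or indexed URLs and tally which regional folders Googlebot is actually picking up. A minimal sketch (the regex assumes the two-letter/two-letter structure shown in the question):

```python
import re

# Assumes paths of the form /{region}/{lang}/..., as in the question's examples.
PATH_RE = re.compile(r"^/(?P<region>[a-z]{2})/(?P<lang>[a-z]{2})/")

def locale_of(path):
    """Return (region, language) parsed from a path like /ca/en/...,
    or None if the path doesn't follow the regional structure."""
    m = PATH_RE.match(path)
    return (m.group("region"), m.group("lang")) if m else None

print(locale_of("/ca/en/prod4130078/2500058/catalog50008/"))
```

If plain crawlable links in the drop-down are indeed the entry point, this kind of check on server logs would show Googlebot requesting the non-US folders directly.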
How do I successfully verify my site for Baidu's webmaster tools?
Instructions for verifying a website via file validation in Baidu's webmaster tools are pretty vague. Does anyone know if the process is the same as Google Webmaster Tools, where the verification string must appear in the URL and in the content of the file? Also, does it truly have to be verified within 2.6 hours? I'd appreciate feedback from anyone who has successfully verified their site.
International SEO | sigmaaldrich
Multi-lingual SEO: Country-specific TLDs, or migration to a huge .com site?
Dear SEOmoz team, I’m an in-house SEO looking after a number of sites in a competitive vertical. Right now we have our core example.com site translated into over thirty different languages, with each one sitting on its own country-specific TLD (so example.de, example.jp, example.es, example.co.kr, etc.). Though we’re using a template system so that changes to the .com domain propagate across all languages, over the years things have become more complex in quite a few areas. For example, the level of analytics script hacks and filters we have created in order to channel users through to each language profile is now bordering on the epic. For a number of reasons we’ve recently been discussing the cost/benefit of migrating all of these languages into the single example.com domain. At first look this would appear to simplify things greatly; however, I’m nervous about what effect this would have on our organic SE traffic. All these separate sites have cumulatively received years of on- and off-site work, and even if we went through the process of setting up page-for-page redirects to their new home on example.com, I would hate to lose all this hard work (and business) if our rankings tanked as a result of the move. So I guess the question is: for an international business such as ours, which is the optimal site structure in the eyes of the search engines, local sites on local TLDs, or one mammoth site with language identifiers in the URL path (or subdomains)? Is Google still so reliant on TLD for geo-targeting search results, or is it less of a factor in today’s search engine environment? Cheers!
International SEO | linklater
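The page-for-page redirects mentioned above are the part most worth getting exactly right in a ccTLD consolidation: every URL on a country TLD should 301 to the same path under a language folder on the .com site, so each old page maps to exactly one new home. A minimal sketch of that mapping, with hypothetical domains and folder names:

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical mapping: old country-specific TLD -> language folder on .com.
CCTLD_TO_FOLDER = {
    "example.de": "de",
    "example.jp": "jp",
    "example.es": "es",
    "example.co.kr": "ko",
}

def migrated_url(old_url):
    """Return the consolidated example.com URL an old ccTLD URL
    should 301 to, preserving the original path and query string."""
    parts = urlsplit(old_url)
    folder = CCTLD_TO_FOLDER[parts.netloc]
    return urlunsplit((parts.scheme, "example.com",
                       f"/{folder}{parts.path}",
                       parts.query, parts.fragment))

print(migrated_url("https://example.de/preise/"))
```

A table like this can then drive the actual 301 rules in the web server config, and gives you something to verify against before and after the switch.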