I can't believe that you, an Interactive Marketing Manager, fell for a cheap-looking fake-review affiliate site.
Except for A Small Orange, the others are crap. About as horrible as you can get in the hosting industry.
Not necessarily. Just remove it from the existing WMT account and ask the new owners to add it to theirs.
If you want to leverage black hat for quick ranking improvements, I'd still suggest private blog networks. A few on BlackHatWorld are still effective. Google has always had trouble handling the 6 million+ domains of the SAPE network, so I'd suggest that as well. Just make sure you don't buy links on hacked sites and the like, which could lead to legal complications.
Artificial but contextual links still work, provided you know what you're doing: keep the spinning quality high (or spin them manually altogether) and pay attention to anchor text and other footprint variations.
Hey Bob,
How about making things a bit simpler for your potential customers? I'd suggest expandable bullet points: you can nest a piece of text within each point and toggle it with a little JavaScript. It'd look a lot neater and cleaner.
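To illustrate, here's a minimal sketch of what I mean; the copy and markup are just placeholders, not your actual content:

```html
<ul>
  <li>
    <!-- Clicking the button toggles the hidden detail text below it. -->
    <button onclick="this.nextElementSibling.hidden = !this.nextElementSibling.hidden">
      Fast turnaround
    </button>
    <p hidden>Most projects are delivered within two weeks.</p>
  </li>
</ul>
```

You'd style the button to look like a bullet, of course; the point is that the detail text stays out of the way until the visitor asks for it.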
I'd also show off testimonials: what others think about you and your services and, more importantly, the results you've achieved for them.
Best of luck!
Your process is pretty much on the right track. You might want to check these:
http://moz.com/blog/google-disavow-tool
http://www.searchenginejournal.com/removing-16-month-google-penalty-hope-yet/
Swap the 302s (temporary redirect, no flow of PageRank) for 301s (permanent redirect, normal flow of PageRank). The functionality stays the same, and since Googlebot won't be logged in, it will follow the 301 redirect.
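If the redirects live in .htaccess, the swap can be a one-line change. A hedged Apache sketch, with placeholder paths standing in for your real URLs:

```apache
# Hypothetical .htaccess rules; /old-page and /new-page are placeholders.
# A temporary redirect (no PageRank flow) looks like this:
#   Redirect 302 /old-page /new-page
# Change it to a permanent redirect so PageRank flows normally:
Redirect 301 /old-page /new-page
```

If the redirects are generated by your application instead, look for wherever the 302 status code is set and change it to 301.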
Hey Gary,
I'm not worried about the main content on the site.
At the moment Google is removing roughly 10 URLs per day from its index. I think the cached copies are creating issues; I forgot to add 'noarchive'. Surprisingly, the crawl stats also show that the average crawl rate has decreased. I placed the noindex,follow tags on September 28th, and so far Google has removed only 80-90 URLs from its index based solely on that (without manual intervention).
The behaviour might seem artificial to them. I have seen people use the 'noarchive' tag, but only when they want to speed up removal from Google's index. Plus, I didn't entirely get what you're trying to achieve.
You can use Twitter + FollowerWonk or Google (the search engine) + custom queries.
You can type things like these on Google to find potential guest posting opportunities:
"keyword" + "guest post"
"keyword" + "write for us"
"keyword" + "submit your post"
etc.
You might be interested in this: http://findmyblogway.com/finding-guest-post-opportunities/ (discusses some unique approaches)
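If you have a long keyword list, the footprint queries above are easy to generate in bulk. A minimal Python sketch; the footprint list just mirrors the examples above, so extend it with your own:

```python
# Build Google footprint queries for finding guest-post opportunities.
FOOTPRINTS = ['"guest post"', '"write for us"', '"submit your post"']

def build_queries(keyword):
    """Return one search query per footprint for the given keyword."""
    return [f'"{keyword}" + {footprint}' for footprint in FOOTPRINTS]

# Example: print the queries for one keyword, ready to paste into Google.
for query in build_queries("gardening"):
    print(query)
```

Run it over your whole keyword list and work through the results manually, vetting each site before you pitch.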
Nah, it doesn't matter. It's still the same as when you put it at the end.
Hey Lucas,
I'd suggest trying WordPress SEO by Yoast and noindexing archives, tags and, if possible, categories. Once they're noindexed, duplicate content stops being a problem: it's duplicate content in the search engines' indexes that they care about.
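For reference, this is the standard tag a plugin like Yoast adds to the head of each noindexed archive page; the noindex,follow variant keeps link equity flowing through the page even though it's kept out of the index:

```html
<!-- Tells search engines not to index this page, but still follow its links. -->
<meta name="robots" content="noindex,follow">
```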
You also have to look for the root of the issue - what caused the sudden rise in the number of indexed pages.
I have a blog that was hit by Panda in 2011 and 2012. The thing is, I've noindexed around 1,000 posts out of 11xx, plus tags and archives. But Google was taking a very long time to remove them from its index, so I had to do a manual removal in Google WMT. I removed /2011/ and /2013/ as directories, and removed /pages/ (this is a WordPress site), so all of them are now out of the index.
It was a smartphone blog started in 2011, which I turned into a tech blog on a new domain (I let the old PR3, DA 30+ domain expire, and now someone's asking me $200 to get it back).
I had a team when it was a smartphone blog. Our articles were featured on places like Engadget, PhoneArena, UberGizmo etc. So, with the loss of the domain, we lost quite a few important backlinks as well.
Also, Authorship doesn't work for the site. The Rich Snippets testing tool says everything's all right, but it never actually shows up in the SERPs. I fear it's because of a penalty; it seems no one has ever considered a penalty that affects Authorship.
So, now that you know the problem and the things I did to fix it, could you tell me if:
Thanks in advance everyone!
Whether you care about SEO or not, redirect the non-preferred version to the preferred one (www or non-www) with a simple 301 redirect. If you have access to your web server, you can modify the .htaccess file at the root of your site. Google for the exact lines of code to add (they depend on whether you're redirecting to the www or non-www version).
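As a starting point, here's a common mod_rewrite sketch for redirecting non-www to www on Apache; example.com is a placeholder for your own domain, and you'd invert the logic if you prefer the non-www version:

```apache
# Hypothetical .htaccess rules; replace example.com with your domain.
RewriteEngine On
# If the request came in on the bare domain...
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
# ...permanently redirect it to the www version, keeping the path.
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```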
You can also set the preferred version of your site (www or non-www) for Google in Google Webmaster Tools.
-----Can you comment on whether this is a best practice for all domains?
Yes, it is.
-----I've run a report for backlinks. If my thought is true that there are some pointing to www.www.MY-CUSTOMER-SITE.com and some to the www.MY-CUSTOMER-SITE.com, is there any value in addressing this?
You shouldn't worry about that at all. 301s are just fine: they don't only redirect visitors; search engines like Google also follow them and pass authority signals to the destination page.
I don't think Google directly takes social media signals into consideration. More social shares often go hand in hand with better rankings, but it's generally a correlation, not causation: http://moz.com/search-ranking-factors
More social shares = more exposure = more chances of attracting links
It's really Googlebot's choice.
Generally, rel=nofollow doesn't mean that Google won't crawl the linked page. It's still useful, though: a rel=nofollow attribute on a link means you don't pass PageRank, trust, authority or any other positive signal to the linked page.
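For reference, the attribute sits on the link itself (the URL here is just a placeholder):

```html
<!-- Google may still crawl the target, but no PageRank or trust is passed. -->
<a href="http://example.com/" rel="nofollow">some link</a>
```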
Hi Jarrod,
The first thing I noticed is that a lot of pages on your site don't contain a rel=canonical tag. For example, this one: http://www.partysuppliesnow.com.au/view-products/96/LED-Furniture
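For reference, the tag belongs in the head of each page and should point at the page's preferred URL; using that LED Furniture page as the example:

```html
<!-- Tells search engines which URL is the authoritative version of this page. -->
<link rel="canonical" href="http://www.partysuppliesnow.com.au/view-products/96/LED-Furniture">
```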
We know that Google is not particularly good at identifying the original source of content. So, report the sites that scraped your content to Google (https://www.google.com/webmasters/tools/spamreport?hl=en). That'll let Google know about the issue and hopefully get the penalty lifted from your site and applied to the scrapers instead.
Another issue could be the Authorship setup on product pages; that's considered Authorship abuse. Generally, you don't want to link a Google+ profile to a site's homepage or other generic pages.
I've had some experience with Panda, and I can say noindexing is very effective in fighting it. If you know of a significant number of low-quality pages on your site (pages you wouldn't want to land on as a searcher), you should add a meta noindex tag to the <head> section of those pages. It takes some time to get out of the Panda box.
Regards,
Rohit