Hi Vic,
Where is this occurring when you see it? Is Google perhaps re-writing?
-Andy
Hi Susan,
First of all, I would want to know what is causing the duplication before I would advise on how best to handle it.
Is it that you have pages that are both www and non-www?
For example, do you have http://newgreenair.com/website/blog/ and http://www.newgreenair.com/website/blog/?
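If that does turn out to be the case, the usual fix is a single 301 redirect so only one version is ever served. A minimal sketch for an Apache .htaccess, assuming the www version is the preferred one (swap the hostnames for your own):

```apache
# Hypothetical example: force the www version with a 301 redirect.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^newgreenair\.com$ [NC]
RewriteRule ^(.*)$ http://www.newgreenair.com/$1 [R=301,L]
```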
-Andy
Hi Becky,
If the links are justified, don't worry. I have clients with 300-400 and no problems with their positions in Google.
That doesn't mean it will be the same for everyone though - each site is different and sometimes you can have too many. Think it through, and if you conclude that most of the links aren't needed and are just stuffing in keywords, then look to make changes.
But on the whole, it doesn't sound like an issue to me - there are no hard and fast rules around this.
-Andy
Hi Sandi,
It will be almost impossible to see who has done this, but you have two options.
Disavowing is the solution most will take because it can be a lot quicker than trying to get hold of people at sites like these.
It is something you will have to stay on top of though - keep checking Search Console for any new ones that pop up that shouldn't.
-Andy
More from a time factor than anything else, I would be spending my efforts on pushing videos on YouTube. Google tends to rank these well - I wonder why!
A lot of people have asked this question over the last couple of years and I tend to offer the same advice - YouTube.
Others might have some different views and insights though.
-Andy
Hi Leszek,
No, it will make no difference which level of SSL you have, as long as you have it.
-Andy
That sounds very likely Matt. As we know, Google is doing a lot more with local results now. We now have to contend with 4 ads and then a local pack. Wait until we get the ads in the local pack as well!
With it only being fairly minimal drops, I think you would have to also take into account that it might not be what you are doing, but what other sites have done. If any other sites have done a big push, it might just have pushed you down a little. Always a difficult call that one though, but I do see it a lot.
-Andy
Hi,
Do you mean that when you do cache:www.site.com that it shows no cache? Sorry, I don't quite understand.
-Andy
Hi,
With just 2-3 places, do you feel this is anything more than just daily fluctuation, or are positions creeping even further down?
What are you using to see new links? OSE isn't always going to be able to capture new links so quickly. How much focus is given to the page content?
-Andy
Marcus has given you some good pointers there and while there does appear to be a small benefit in putting your keywords into a URL, it isn't something I would change just to do so.
In terms of how should a URL look, it depends on what makes the most sense for the products / pages. If you have a shop, then you might want to break it down to categories and products - if not, then a flat structure will probably work better.
Keep it straightforward and informative, but never stuff it for the sake of it. Shorter URLs are better where you can manage it, but don't aim for a short one if it misses the point.
-Andy
Hi Omar,
You really shouldn't see that much, but any drop should only be temporary. How long ago did you make the change? 50% does seem on the high side - I would expect a temporary drop of around 15% but have seen higher.
Did you update your sitemap as well?
-Andy
**They are purely for links. So do I directly disavow them?**
Yes, if there is no way to do anything about it, just disavow.
**Should I go ahead and disavow the bad links? Or let it be?**
Yes, these sorts of links will be doing you no favours at all. OK, Google might just be ignoring them, but it's much better to err on the side of caution. If Google is just ignoring them now but changes their stance at a later stage, you know you are covered.
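For reference, the disavow file Google expects is just a plain-text list, one entry per line, with optional # comments - you then upload it through Search Console's disavow tool. A minimal sketch (the domains here are placeholders, not from this thread):

```text
# Hypothetical example of a disavow file.
# Disavow a single spammy page:
http://spammy-example-site.com/bad-links-page.html
# Disavow an entire domain:
domain:another-spammy-example.com
```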
-Andy
Hi,
**I myself agree that some content is poor and thin and there is a problem of plagiarism also**
This is a biggie - page quality is a huge thing for Google and you must ensure that the pages someone lands on have a reason to exist and answer the question they asked correctly.
If you know there are spammy links, just how spammy? Are they damaging? Do they link with money-phrases? Has there been any penalty on the site at any time? All these should be looked at (and more) when assessing a backlink profile.
If you know there are links that are so poor that they could be damaging, either ask for them to be removed, or disavow them.
-Andy
When you say the site is under-performing, are you talking just in terms of search positions or once you get visitors there as well? Is the UX all in order and have you completed tests to make sure people are navigating their way around correctly?
-Andy
Hi Christopher,
If all you are talking about is a few thousand pages, I wouldn't worry about it. Google handles enormous sites that have hundreds of thousands of pages added and removed with huge frequency. I can't think of any reason why this would be a negative hit if all you are doing is correcting your content.
-Andy
No, you need to redirect all URLs...
Have a read of this on using the change of address tool in Google.
-Andy
Hi Donald,
The very best starting point is by reading the Google guides on moving a site.
https://webmasters.googleblog.com/2008/04/best-practices-when-moving-your-site.html
I don't actually know of plugins for WordPress that will do this, but your solution is going to lie in making the required changes in your .htaccess file. There is a lot of information out there on how best to do this, but this is what you need to be looking at.
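As a rough illustration of the .htaccess approach (the domain names below are placeholders, not from the thread), a whole-site move is usually handled with a single 301 rule that maps every old URL to the same path on the new domain:

```apache
# Hypothetical sketch: 301-redirect every URL on the old domain
# to the matching path on the new domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-example\.com$ [NC]
RewriteRule ^(.*)$ http://www.new-example.com/$1 [R=301,L]
```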
-Andy
Just FYI, noindexing to handle duplication isn't really best practice now and isn't advised by Google. Rel=canonical is there to help with duplication issues. And you certainly don't want to be implementing iframes on your site to try and get around this.
-Andy
How are your images being fed into the site? Are you using a CDN?
-Andy
Hi Julien,
I always start with robots.txt in these cases, but that looks OK.
Is anything being blocked by JS? Something else to look at: if you are using something like WordPress, there are plugins that can block access to these without you realising.
Looking at the URL of the image, this appears to be hosted on a 3rd party site?
-Andy
Hi JMB,
What you have done sounds like it makes sense. A little awkward to visualise, but reducing duplication / similar pages through the use of rel=canonical is the right thing to do.
Aside from Search Console, was there a drop in the SERPs that made you want to do something about this?
Regarding SEO, don't worry about your site having so many canonical pages. The root page is there for indexing while the canonical pages are needed for the user - it sounds to me like you are doing the right thing.
-Andy
Hi,
I haven't used Litespeed myself, but where is the main store view targeted to? Have you implemented HREFLANG to signal to Google which pages are for which audience?
-Andy
Hi,
Really you want to be looking at something like Ahrefs where it will show you lost and found links. I don't know offhand if OSE has that facility.
You might find that your drop is because of sitewide links that have been removed, but without thorough investigation, I could just be guessing. It could be a coincidence that the drop is related to the removal of the links.
Has anything else been done to the site that could account for this? Do any date drops coincide with a Google algorithm tweak?
-Andy
This is hugely dependent upon what you want to do. There is certainly no outright SEO benefit to doing this.
Personally, I don't like naked domains and much prefer the www. version.
Have a read of this to see if you can gain any additional information.
-Andy
The answer is whichever one works best for your particular situation.
Yes, while fewer folders is a benefit, it might not make sense to do this if you need to separate products, for example.
Try to keep URLs short, sensible and straight to the point - don't add keyword bloat and never stuff the URLs.
-Andy
Have you thought about using Cloudflare? From what I can understand, it can help block malicious traffic.
There is no telling exactly what Google is objecting to - if the site keeps dropping, this could be a major issue.
-Andy
Just FYI, I think you will find the robots.txt is fine.
Just to test this, I used a couple of the online testing tools to confirm:
http://technicalseo.com/seo-tools/robots-txt/
http://tools.seobook.com/robots-txt/analyzer/
For peace of mind, I would check this in Search Console and use the robots.txt tester in there also.
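For anyone checking their own file against this, a fully permissive robots.txt that blocks nothing and points crawlers at the sitemap is as simple as the following (the sitemap URL is a placeholder):

```text
# A minimal, fully permissive robots.txt.
User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap.xml
```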
-Andy
Hi Simon,
You are seeing the same as lots of others at the moment - many of my own clients have noticed impacts as well - especially so if you get a local result that has 4 ads, then 3 local results, and Google testing ads within the local pack as well. Soon 1st place organic will be 2nd page!
I'm not saying this is what is going to be causing your issues, but it is likely and without looking into this in more detail, it's hard to say what else might be causing you issues.
-Andy
I don't know if it's related, but I can't actually get to the site at the moment.
www.enallaktikidrasi.com's server DNS address could not be found.
-Andy
Hi,
Well, the page is indexed by Google (just do a cache: before your URL) but useful indexing can take any amount of time. How old is the site / page?
I would also check your site in Open Site Explorer because it does show a rather high spam score and you might want to focus on building some links to your site as well.
There could be many other things going on, but I am guessing that having a low PA / DA is going to be a big factor for you.
-Andy
You're very welcome
-Andy
As long as you use cross-domain rules, you will be absolutely fine - you are far from the only person who has a need or desire to do this.
-Andy
Ahh OK you didn't say that - you only talked about speed.
One thing I can tell you is that a subdomain isn't going to help rankings - in fact, research has shown that a folder structure is the best way to go, though it doesn't sound like that would work as well for you.
At the end of the day, as long as you set up your hreflang correctly across domains, you will be fine and you won't get penalised.
Here is some required reading for international SEO:
https://moz.com/blog/hreflang-behaviour-insights
https://sites.google.com/site/webmasterhelpforum/en/faq-internationalisation
https://support.google.com/webmasters/answer/182192
https://support.google.com/webmasters/answer/189077
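To illustrate the cross-domain setup (the domains below are hypothetical): each page lists every language/region variant of itself, including its own URL, and the same set of tags must appear on every variant so the annotations are reciprocal:

```html
<!-- Hypothetical sketch: place in the <head> of BOTH pages. -->
<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/page/" />
<link rel="alternate" hreflang="en-in" href="http://www.example.in/page/" />
<link rel="alternate" hreflang="x-default" href="http://www.example.co.uk/page/" />
```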
-Andy
Who is your host? Some are starting to crack down on crawlers across the board, with many blocking the likes of SEMrush. It might be worth checking whether there are any other technical issues.
-Andy
Hi Stephane,
If all you are looking for is a speed boost, you could always use a CDN like Cloudflare to deliver content to your India visitors. No need to create another site.
-Andy
Hi,
You are in the same boat as countless others waiting on the next release of Penguin.
My thoughts around this are that it is probably a huge piece of work for you, one that means you are going to have to rewrite all of the content again. More of a worry is if Google catches on to what you are doing - in their eyes, they might see that the business has two sites when there is only a need for one, and with that might come many more issues.
You wouldn't really want to be releasing a site and then have to disallow all crawler access, as that won't do anything for it.
If you were to invest in a new site, I would be advising you switch the old one off before the new one was open to being crawled.
However, Google have said that Penguin isn't far off now - but they have been saying that for a while.
-Andy
Hi Issa,
There are actually very few reasons to noindex / nofollow pages these days as most issues can be handled through a 301 or canonical, so if there is an option that allows this, do it.
From what you are saying, and as long as I am understanding correctly, this is just like an e-commerce site that has page 1, page 2, page 3, etc of the same product, and they all rel=canonical back to Page 1 - which is the right thing to do.
This tells Google that you know there is duplication and not to pay any attention to it, so while the pages are open for Google to see, it means that you won't get penalised.
So you have Job 1, Job 2, Job 3, etc, with a rel=canonical back to Job 1.
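As a sketch of what that looks like in the markup (the URLs are hypothetical), each duplicate job page carries a canonical tag pointing back to the main one:

```html
<!-- Hypothetical example: in the <head> of job-2, job-3, etc. -->
<link rel="canonical" href="http://www.example.com/jobs/job-1/" />
```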
Here is a little extra reading from Google:
https://support.google.com/webmasters/answer/139066?hl=en
I hope that helps?
-Andy
Hi,
Basically, you can't have two pages competing for the same keywords - this will cause you issues in Google and will end up with just one or the other being ranked. There really isn't a way around this. You could have the new page as a link off the existing page though, and promote it that way as an "or how about this" feature?
-Andy
I don't tend to agree with John really because there is absolutely no harm in doing some outreach to get links built. Outreach is an important part of any linkbuilding campaign.
As for the other part of your question, try not to focus on the number of links; instead, spend time making your page as good as possible so that it is as attractive to link partners as it can be. There is no such thing as the right number of links to avoid annoying Google, but if your profile doesn't look natural, you will end up with a slap in the form of a manual action.
Focus on your page and ask yourself "would I want to link to this?". What is it about the page that makes it such an amazingly useful linkable asset that others will want to know about?
When performing outreach, do research each site because a cold e-mail is unlikely to get you anywhere.
Have a read of these guides to help you get along.
https://moz.com/beginners-guide-to-link-building
http://pointblankseo.com/link-building-strategies
I hope this helps.
-Andy
Ahh I see.
Sorry then, this isn't something I have actually tried for myself or for any clients, so I don't have any data to give you that would be useful, I'm afraid.
-Andy
Hi Lise,
It isn't best to do this because Google won't give you any benefit for doing so (or very minimal at best). What you need to do is redirect the old URL to the new one (if you haven't already), as this will show Google that there is a new page in place. Otherwise you are just starting with a new page.
Don't change it back, but don't do this to other pages unless you are making them friendly URLs.
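For a single page, that redirect is one line in .htaccess (paths and domain here are placeholders, not from the thread):

```apache
# Hypothetical example: 301-redirect one old URL to its new, friendly one.
Redirect 301 /old-page.php?id=123 http://www.example.com/new-friendly-page/
```

Note that mod_alias's Redirect matches on path only, so if the old URL has a query string you would need a mod_rewrite rule instead; for plain paths the one-liner above is enough.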
-Andy
Ah no problem at all.
What I would suggest, is having a look at the following articles:
http://sproutsocial.com/insights/facebook-advertising-guide/
https://adespresso.com/academy/guides/facebook-ads-beginner/
http://www.wordstream.com/blog/ws/2014/01/30/facebook-advertising-tips
Each will give you some valuable information and you can take it from there.
-Andy
Hi,
Are you talking about Facebook advertising?
-Andy
Hey Edward,
I would be hugely surprised if you got anything from doing this at all, but the only way to tell what it will do for you, is to test it. I highly doubt you would get any sort of a penalty for doing this, but I certainly wouldn't roll it out across the site.
-Andy
Absolutely agree with Tim. Just having microsites like this hanging around could cause you problems. If they are inactive, just turn them off and don't even redirect them. If these sites have ever had a penalty, you might end up passing something down the line to your main site.
If you have a number of sites that you control with the sole aim of trying to prop up the SEO of your main site, then what you have is a small PBN (Private Blog Network), and these are absolutely against Google's TOS.
-Andy