Thanks so much guys.
Please keep more responses coming
Cheers,
-Andy
Morning all!
I am doing some research at the moment and am trying to find out, just roughly, how long you have ever had to wait to have a page re-indexed by Google.
For this purpose, say you had blocked a page via meta noindex or disallowed access by robots.txt, and then opened it back up.
No right or wrong answers, just after a few numbers
Cheers,
-Andy
I noticed this ability had been removed about a week ago. I don't do much Social Media for clients these days, but that had been a real benefit.
-Andy
Hi Bob,
Even roughly, this isn't something I could do. If a click is worth $1000 and you have to pay $50 for it, you might consider it a worthwhile gamble. If it was going to cost you $400 for the same click value, you might not think so.
There are no general guides on this I'm afraid - just what you are happy with, within the realm of sensible costs that is. I can only advise that you try to get the biggest bang-for-buck. Choose phrases that are cheap but that, added together, will net you a decent number of clicks for not very much money.
-Andy
Hi Bob,
There's a bigger picture here. What is it you are trying to do? Are these phrases that you are looking to match up to a page? In what way are you hoping to gain revenue?
-Andy
Hi Becky,
The two do use very different algorithms, but there isn't as much information kicking around for Bing as there is for Google, for obvious reasons.
In terms of tracking keywords in Bing, check out Accuranker, because it lets you track Google and Bing rankings separately.
I would also sign up to Bing Webmaster Tools (if you don't already) and have a look through the set of tools they offer. Diagnostic Tools might be a good place to start, as would the SEO Analyzer.
Other than that, there isn't an awful lot more I can suggest because I only really focus on Google. I have always found that if a site does well in Google, it almost always does well in Bing too. However, have a read through their Webmaster Guidelines as well.
-Andy
Hi Miriam,
It's the one that was on the phone. Does everything I need
-Andy
Hi Brian,
I have a client working on correcting this issue with his site at the moment. They run a big media site that allows access once paid, but so many of these sites suffer from the same issue: because they allow Google to index the whole text but only show visitors a portion of it, anyone who looks at the cached version can read it without paying.
In terms of correcting it, I would first have a read on how Google handles subscription sites. You can find that info here. Google prefers the "First click free" model.
There is additional reading on this subject over at Search Engine Land. First Click Free (FCF) is what you want to be looking into in more detail.
I hope this helps a little.
-Andy
Hi Kathiravan,
Google's view on this is never to hide important information if it is something that you or your visitors rely on.
Let's say you have 100 words that explain everything a potential buyer would need, and then hide a further 300 words that are only available after a click. Google would assume that what is hidden isn't as important. Looking at the attached image, I would question whether you have given enough information there.
I like to look at this as what is best for the prospective buyer, and give them as much as they need. Keep the important information high up the page, and then, depending on what else you need to include, decide whether hiding the rest behind a tab keeps the page clearer, or whether to add the full content below the fold.
There are many factors with eCommerce sites - it's a much bigger discussion.
-Andy
Hi Ryosuke,
No, Google wouldn't see this as duplication. Information like this is seen as supplementary, but necessary to the user experience. I have a number of clients with exactly the same, and none of them have ever had any issues.
Some sites like to have this information on each page, others like to add it as a pop-up / modal window or on a single page linked from the product pages. All are perfectly reasonable. I am sure if I searched around that I would find Google saying the same thing - they will understand what it is, and just essentially ignore it.
-Andy
Interesting tool I haven't seen before
-Andy
Hi Tormar,
You tend to find that almost every site has an XML sitemap tied in with Search Console, but an HTML sitemap doesn't really do much for Google and is more of a navigation aid for people to find places on large sites. You certainly won't get a penalty or benefit if you do or don't carry one.
Add one if you feel it will be beneficial, but don't if not.
-Andy
Hi Oliver,
It does the same with my site, and although I have social icons throughout the site, I don't advertise an e-mail address because of spam.
Remember that OSE just takes a number of metrics and gives an estimation. This by no means suggests that you are doing anything wrong - more that the crawler can't see where your address and details are. It might be that just one element of the address is missing to the crawler, yet it reports the whole thing as missing.
Don't worry about it at all.
Check your site in Google by doing cache:www.site.com and checking that Google sees your address. If it does, don't worry.
-Andy
Hi Luis,
You can actually do this via your .htaccess file where you would use an asterisk (*) as a wildcard and give the site expressions to 301 to another location, but this is going to 301 all pages to the same location, which I am assuming you don't want to do.
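A minimal sketch of the two approaches in .htaccess (the /old-blog/ and /new-blog/ paths here are hypothetical, and the rewrite rule assumes mod_rewrite is enabled):

```apache
# Send everything under /old-blog/ to a single page - this is the
# "all pages to the same location" case:
RedirectMatch 301 ^/old-blog/.* /new-blog/

# With mod_rewrite you can keep the matched part, so each old URL
# 301s to its equivalent new URL instead:
RewriteEngine On
RewriteRule ^old-blog/(.*)$ /new-blog/$1 [R=301,L]
```

The second form is usually what people want when the old and new URL structures mirror each other.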
How many URLs are we talking about?
-Andy
Have a read through the latest Whiteboard Friday from Rand - you might pick up some good pointers from that
https://moz.com/blog/wrong-page-ranks-for-keywords-whiteboard-friday
-Andy
Hi,
No, Data Highlighter doesn't overwrite schema, but I can say that in every instance I have seen it tried, it has never worked. I always end up creating correct schema for a page.
Personally, I would be interested to know if anyone has ever got Google's Data Highlighter to actually work.
-Andy
My phone is constantly attached to me, more so than anything to write with, so on the home screen I have my voice recorder. I say what I need to and then rename the file. It takes me 2 seconds and means I forget nothing - even when I am offline or have limited internet access.
The files stay there until they are no longer required.
-Andy
It's an interesting one and Paul has pretty much covered what most SEOs think about the use of bold for anything more than usability these days.
I actually performed split testing for this for a client a number of months ago (they like everything to be tested) and while the positions stayed static in Google, there was actually an increase in the amount of time that people spent looking at content with those bold words in them. I used HotJar to watch recorded user sessions.
Because of this, they use bold to emphasize key points now - A lot more came out of the testing around it, enough to make me realise just how much people scan pages for snippets of what they want.
-Andy
Hi,
While backlinks are a big factor, they are far from the only factor. It is as much about your content and usability as it is about links.
With the links, you could get just 1 really strong link that is much better than everything they have, and that might be enough - but I doubt it. It's just not that straightforward.
Best bet is to have a read through this MOZ guide on link building to get a bit of an intro into best practices and a more in-depth explanation.
-Andy
That's good news on both fronts.
Just remember with the disavow file, to add to the file that is there. It's too easy to overwrite a file and reverse work you have already done.
-Andy
Hi Monica,
It is horrible when that happens - just to check, have you secured the site now to prevent unauthorised access?
With regard to the external anchor texts, these are going to be picked up by what the MOZ crawler finds, but it won't be an instant change. The crawler will have to revisit all of these sites to see links that have changed.
Don't worry about these still being shown in MOZ - it will take time, but if you still find external links to you, just build a disavow list.
Have you noticed any drop in the SERPs?
-Andy
Hi,
I suspect this is a MOZ thing. Don't worry about the fact it is appearing in an H1, that is just coincidental. There is no SEO rule based around this at all.
-Andy
Hi Radi,
Syndicating content isn't always bad if it is done right, and Google is pretty good at getting this right anyway. They know that you can't stop someone copying your content.
Have you checked to see if they are setting your pages as the canonical page? This is the way that syndication should work - ideally they would rel=canonical back to the page where the article resides on your site. If they don't, I would be asking them if they would mind doing that for you. You clearly don't mind good exposure, but I would call it a safeguard for both sites.
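For illustration, this is the tag the syndicating site would place in the head of their copy (the URL here is hypothetical), pointing back to the original article on your site:

```html
<!-- On the syndicated copy, in the <head>: -->
<link rel="canonical" href="https://www.example.com/original-article/" />
```

With that in place, Google should consolidate ranking signals to your original rather than the copy.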
-Andy
If someone is trying to build links by interlinking footer links, they are going to be very disappointed.
If you read what I said again, it is highly unlikely that Google would penalise with a few footer links, but there is no clear cut black & white guarantee that nothing would happen with this. There are so many other factors that could play a part in any decision that Google makes.
To avoid doubt, no-follow them. The links are still present and the benefit to the users is still there.
-Andy
Hi Dave,
You are right to look at cross-domain canonical to resolve this. This is absolutely what it is intended for. Google gives you more guidance on how to ensure this is done correctly, here.
Will it affect anything else? Well, it shouldn't, but a little due diligence after the work has been completed will help confirm all is OK. An audit of the two sites will help you to see that everything is mapped correctly.
-Andy
Hi Gal,
I always advise in these cases, that the links are no-followed. This removes any doubt and concern over if Google will object. You are right to be cautious because you don't want any penalties to come from this, but it is highly unlikely that any would anyway. Footer links are largely ignored by Google.
There is a reason for the links to show to the different businesses, but I would just err on the side of caution.
-Andy
That's the wrong way to try and gain backlinks, Geoff. You want to build links to your main site, but you want to be creating something in one place that is going to benefit you. An external site that then links back to you from duplicated content won't do it for you.
Good linkbuilding takes time, and it starts with your content - write something that is better than others have and take to social media to promote it.
I could honestly write a book about link building - it is that involved, but start by perfecting content that others will want to share and link to.
-Andy
Hi Geoff,
This isn't best practice at all. You don't want to have two or more copies of anything anywhere.
I don't even think that re-purposing any of the articles would be a good move.
Imagine that you are Google - if you see the same article in two or more places, what do you do? Penalise one site? Realise that both sites are owned by you and punish both sites? Not show either article in a good position?
The only safe move here is to create amazing content for both blogs - but why have 2 blogs? Why double your effort when you could be using all the content to prop up the main site?
-Andy
Oh dear! That does sound like a bit of a mess.
First of all, don't let them change your telephone number to a tracking one. If everything about you doesn't match up, then this will work against you. Google is very specific about this.
You also don't want lots of websites about you that all say the same thing. The advice that Google allows 10 websites is absolute trash. If they are duplicating information about you, then they are going against Google's guidelines.
My advice: yes, cancel. There is nothing I have read there that doesn't fill me with dread. These sorts of places that rattle through clients in this manner don't care about being unethical or going against best practices. There is absolutely no substitute for a sensible SEO campaign where you aren't competing against your own site.
-Andy
I would ask on the Squarespace forums then rather than here - you are more likely to find others who have had these same issues and if there is anything more up to date, you will find out.
-Andy
Hi,
"I heard RSS Feed helps in ranking."
No, this is not a ranking factor. I have never heard of or seen any research about this either. Think about all the sites that rank exceptionally well without one.
Use it if you seed your content around, but don't think it will do anything for your SEO.
-Andy
Hi,
Having never used Squarespace, I haven't ever come across this issue, but a quick search came up with this, which I understand includes a workaround.
-Andy
Hi Stacey,
When I inspect the page, I can see that there is something in your header that seems to be injecting this. It appears to be at the end here:
If I were you, I would check out the header file first to see if you can see the 'void' bit. It isn't supposed to be there so might be something that has been left there unintentionally.
-Andy
Hi Karl,
Google doesn't mind a bit of content that is the same from page to page - they have said so in the past. If the page requires that a bit of the content is duplicated, as long as it serves a purpose, you will be absolutely fine.
Look at the pages and see if what is there would be considered beneficial to the page.
The issues for duplication tend to arise when you are copying huge swathes of text from one page to another, plagiarising others work or even spinning.
From what you have said, it sounds like you will be absolutely fine, but feel free to post an example here or PM it to me if you wish and I will happily take a look.
-Andy
Hi,
Do nothing until Google has rolled this out and confirmed that they are going to go with it. You don't want to make lots of changes if you only have to change them all back again - this could cause you a number of issues.
-Andy
Hi Andrew,
I would definitely avoid throwing everything at Google all at once. This won't give any article time to gain traction and will severely limit your chances to share everything through social channels.
There isn't a magic timescale over which you should publish this, but if there is that much, then you should be looking at months rather than days or weeks.
Leave the season-sensitive articles until those seasons to maximise on the impact they can have.
I would also update any articles that might have out-dated information, so look at these before they go live.
-Andy
Hi Kim,
It's very difficult to imagine so many possible scenarios of why this could be happening. Have you looked into any canonical issues? This is often a problem with e-commerce sites.
You would want to make sure that if there is a worry of duplication, that this is being handled correctly.
-Andy
Ria gave a great URL to read there, but unless I missed them, it's missing a few points, like:
There are a lot more that I would ask, but these are ones I couldn't miss.
-Andy
Hi Rick,
If you noindex a page, Google can and will still crawl it. They just won't index it.
If you disallow the page, then Google just won't see it. This is the safer option if you have other similar landing pages.
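As a quick sketch of the two options (the /landing-pages/ path here is hypothetical):

```html
<!-- noindex: Google can still crawl the page, but won't index it -->
<meta name="robots" content="noindex">
```

```
# robots.txt - Google won't crawl anything under this path
User-agent: *
Disallow: /landing-pages/
```

Note the meta tag only works if Google is allowed to crawl the page and see it; disallowing in robots.txt stops the crawl itself.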
-Andy
Hi Jonathan,
When I visit your site and go to www.swiftcomm.co.uk/, it redirects me to www.swiftcomm.co.uk, so this looks OK. If I check the Google cache for both versions of the URL, the only one cached is www.swiftcomm.co.uk/ - this looks fine to me.
Might I suggest you download Screaming Frog and run a crawl with that as well, just to be sure. I can't do this today myself as I am on a Chrome box.
-Andy
You can't use how a site looks or feels to you to get an indication of if it is safe or not...
You need to be checking, aside from the MOZ spam score, through something like SEMrush, what their traffic is like. Have they had any penalties that might be affecting them?
Use The Wayback Machine as well to check the history of the site. This can tell you if it was ever a site you would rather not be associated with, or if it has been recently purchased because it has good history and links. Expired domains surface very frequently, and people will charge for guest blogging and links etc.
You should also check who else links to the site. Are there reputable sites that link to it for a good reason?
Just a few more tips to keep an eye on.
-Andy
Hi Susan,
One thing I have found to be effective with sitelinks, is internal linking.
Create an internal linking campaign to target more suitable pages and you can find these changing. That said, it's a bit odd that they still haven't been removed from Google.
I am about to Tweet John Mueller at Google with this thread to see if he has any input at all.
-Andy
Hi Rick,
I would be creating custom landing pages, but I would disallow them via robots.txt rather than noindex them. That way you can remove the pages without issue at any stage.
Doing this won't give Google anything to frown about.
-Andy
No problem at all
-Andy
When did you notice that this started? Are there any dates that coincide with anything? Is it so significant that you can't just put it down to the number of users on the different devices?
Have there been any changes to the site that might have affected the tracking code at all? Could it be an issue with reporting in Analytics?
-Andy
Hi Allie,
I think I understand what you are saying.
Just to clarify, are you asking if optimising the canonical page is the right thing to do? It sounds to me like what you are doing is correct. No-indexing the category pages will help prevent duplication, or you could also canonical those back as well.
-Andy
Hi Mark,
You can do this with URL Profiler. It will analyse the site / pages you tell it to, using Google PageSpeed.
-Andy