Hi Dan
I had missed that reply; cheers for the heads-up (my email notification never came through). I'll talk to the devs about implementing it to target all bots.
Thanks!
Many thanks for taking the time to respond and sorry for my slow response.
At present we feel it could be seen as cloaking, as we would be introducing specific code just for Google. It would also mean the rendered view in Search Console differs between Googlebot and users.
Thanks
Kate
Hi
We have an issue with images on our site not being found or indexed by Google. We have an image sitemap, but the images are served on the Sitecore-powered site as CSS background images inside <div> elements, which Google can't read. The developers have suggested the solution below:
For Googlebot:

<img class="header-banner__image" src="/~/media/images/accommodation/arctic-canada/arctic-safari-camp/arctic-cafari-camp-david-briggs.ashx"/>

For non-Googlebot user agents (a noscript fallback plus the lazy-loading div):

<noscript class="noscript-image">
  <div role="img"
       aria-label="Arctic Safari Camp, Arctic Canada"
       title="Arctic Safari Camp, Arctic Canada"
       class="header-banner__image"
       style="background-image: url('/~/media/images/accommodation/arctic-canada/arctic-safari-camp/arctic-cafari-camp-david-briggs.ashx?mw=1024&hash=D65B0DE9B311166B0FB767201DAADA9A4ADA4AC4');"></div>
</noscript>
<div aria-label="Arctic Safari Camp, Arctic Canada"
     title="Arctic Safari Camp, Arctic Canada"
     class="header-banner__image image"
     data-src="/~/media/images/accommodation/arctic-canada/arctic-safari-camp/arctic-cafari-camp-david-briggs.ashx"
     data-max-width="1919" data-viewport="0.80" data-aspect="1.78" data-aspect-target="1.00">
Is this something that could be flagged as potential cloaking, though, as we are effectively serving different code depending on whether the user agent is Googlebot? The devs have said that, via their contacts, Google has advised them that the original way we set up the site is the most efficient and considered approach for the end user. However, they have acknowledged that Googlebot is not sophisticated enough to recognise this. Is the above solution the most suitable?
Many thanks
Kate
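A side note on the cloaking concern: a user-agent-agnostic variant of the same fallback, sketched below (an illustration, not the developers' proposal), serves identical markup to every visitor, so nothing varies by user agent. The lazy-loading div stays as it is, and a plain img element sits inside a noscript for any client that doesn't execute JavaScript, crawlers included.

<!-- Sketch: identical markup for all user agents, no UA detection -->
<div class="header-banner__image image"
     aria-label="Arctic Safari Camp, Arctic Canada"
     title="Arctic Safari Camp, Arctic Canada"
     data-src="/~/media/images/accommodation/arctic-canada/arctic-safari-camp/arctic-cafari-camp-david-briggs.ashx"></div>
<noscript>
  <!-- Crawlable fallback: a real img element rather than a CSS background -->
  <img class="header-banner__image"
       alt="Arctic Safari Camp, Arctic Canada"
       src="/~/media/images/accommodation/arctic-canada/arctic-safari-camp/arctic-cafari-camp-david-briggs.ashx"/>
</noscript>

The noscript fallback gives crawlers a real image reference to index without the response ever differing by user agent.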
A problem has been introduced into our sitemap whereby previously excluded URLs are no longer being correctly excluded. These return an HTTP 400 Bad Request response to crawlers, although they do redirect correctly for users.
We have around 2,300 pages of content, and around 600-800 of these previously excluded URLs.
An example would be http://www.naturalworldsafaris.com/destinations/africa-and-the-indian-ocean/botswana/suggested-holidays/botswana-classic-camping-safari/Dates and prices.aspx (the page does correctly redirect to users).
The site is currently being rebuilt and only has a lifespan of a few months. With this in mind, the cost our current developers have quoted for resolving the issue is quite high. I was just wondering:
How critical an issue would you consider this to be?
Would it be sufficient (bearing in mind this is an interim measure) to change these pages so that they have a canonical tag or a redirect? They would, however, remain in the sitemap (see the sketch below).
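For the canonical option, a minimal sketch. The target URL here is an assumption, formed by truncating the example URL to its parent holiday page; in practice it should match wherever the user-facing redirect already points:

<!-- On .../botswana-classic-camping-safari/Dates and prices.aspx -->
<link rel="canonical" href="http://www.naturalworldsafaris.com/destinations/africa-and-the-indian-ocean/botswana/suggested-holidays/botswana-classic-camping-safari/" />

One caveat: a canonical tag only helps if crawlers can fetch the page. While these URLs return a 400 to bots, the tag would never be seen, so the 400 response would need fixing either way.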
Thanks
Kate
I'd take a look at Analytics and compare the periods before and after the drops, looking at landing pages filtered by the organic traffic medium. This should show you which pages have seen the biggest losses, which in turn should give you an idea of which keywords have been affected.
If there are just a few pages, then run through a technical audit checklist on those pages to make sure no errors have been introduced - for example, a page hasn't accidentally been set to "noindex" (I've seen it happen).
You could also compare average position of keywords between two periods in WMT to see where the biggest drops are.
Many thanks for taking the time to reply.
Ryan - I am talking to CMS Source about these options; it's not possible with the current configuration but it looks like we may be able to change this with a bit of development work.
Sean - Thanks, I had considered the canonical, which would at least prevent a duplicate content issue, although I think we need to take measures to stop this admin subdomain being accessible, which the canonical won't help with.
Lynn - Thanks, I'll do that as an interim measure.
Thanks
Kate
Hello Moz Community
Hoping somebody can assist.
We have a subdomain, used by our CMS, which is being indexed by Google.
http://www.naturalworldsafaris.com/
https://admin.naturalworldsafaris.com/
The pages are identical (the same templates serve both hosts), so we can't add a noindex or nofollow to the subdomain alone.
I have both set up as separate properties in Webmaster Tools.
I understand the best method would be to update the robots.txt with a disallow for the subdomain - but the robots.txt is only accessible on the main domain: http://www.naturalworldsafaris.com/robots.txt. Will this work if we add the subdomain exclusion to this file?
It means it won't be accessible at https://admin.naturalworldsafaris.com/robots.txt (where we can't create a file), and therefore won't be seen within that specific Webmaster Tools property.
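Note that robots.txt is scoped per host, so an exclusion added to the www file will not apply to the admin subdomain. Blocking crawling of the subdomain would require a file served from the admin host itself, along these lines:

# Would need to be served at https://admin.naturalworldsafaris.com/robots.txt
User-agent: *
Disallow: /

If no file can be created there, an alternative is to have the server send a noindex header on every response for the admin host (the header is standard; how it is set depends on the server configuration):

X-Robots-Tag: noindex, nofollow

Unlike a robots.txt disallow, this also removes pages that are already indexed, because Google can still crawl them and see the header.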
I've also asked the developer to add password protection to the subdomain, but this does not look possible.
What approach would you recommend?
Hi
I was hoping to get some thoughts and opinions on our blog. It is part of our main site (not on a subdomain) but performs very badly, pulling in very little organic traffic (only accounting for 0.6% of our organic traffic).
Every page of the blog is listed in our sitemap, and using Screaming Frog I've done spot checks of several pages to see if they are indexed, which they have been. Looking at Google's text cache, all the content is visible.
Pages are often well shared on social media (for example): http://www.naturalworldsafaris.com/blog/2014/10/antarctica-photography-safari-2014-updates.aspx
I'm aware that we do need more links coming into the blog but I still feel that it should be performing better than it is.
Any suggestions would be appreciated!
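One quick large-scale check to sit alongside the Screaming Frog spot checks: a site: query scoped to the blog path gives a rough count of how many blog URLs Google has indexed:

site:www.naturalworldsafaris.com/blog

Comparing that count against the number of blog pages in the sitemap indicates whether this is an indexation problem or purely a ranking one.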
Hi both
Thank you.
Linda - It's people arriving at the Canada page who want to see all Canada, not the other way round. People select Canada as a destination but are also interested in our Arctic Canada trips.
The Canada page itself doesn't rank well or act as a landing page portal, however it is important in terms of site structure as people check that destination to see if we do trips there once they reach the site. People equally come onto the site looking for a trip to the Arctic as a destination so we do need both within the site in terms of the user journey.
The canonical tag would be my preference. If there is enough unique content on both pages, do you think it matters that the holidays list is the same? This could be an alternative, although we won't escape a percentage of duplication.
Hello
Just trying to look at how best to deal with this duplicated content.
On our Canada holidays page we have a number of holidays listed (PAGE A)
http://www.naturalworldsafaris.com/destinations/north-america/canada/suggested-holidays.aspx
We also have a more specific Arctic Canada holidays page with different listings (PAGE B)
http://www.naturalworldsafaris.com/destinations/arctic-and-antarctica/arctic-canada/suggested-holidays.aspx
Of the two, the Arctic Canada page (PAGE B) receives a far higher number of visitors from organic search.
From a user perspective, people expect to see all holidays in Canada (PAGE A), including the Arctic-based ones. We can tag these to appear on both; however, it will mean that the PAGE B content is duplicated on PAGE A.
Would it be best to set up a canonical link tag to stop this duplicate content causing an issue? Alternatively, would it be best to noindex PAGE A?
Interested to see others' thoughts. I've used a (Jan 2011, so quite old) article for reference, in case anyone else comes to this topic looking for information on something similar.
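For reference, the mechanics of the canonical option: if the decision were to consolidate toward PAGE B (the stronger organic performer - which direction is right is exactly the open question here), the tag would sit in the head of PAGE A:

<!-- In the <head> of PAGE A (Canada suggested holidays) -->
<link rel="canonical" href="http://www.naturalworldsafaris.com/destinations/arctic-and-antarctica/arctic-canada/suggested-holidays.aspx" />

Worth noting that Google treats rel="canonical" as a hint rather than a directive, and may ignore it where the two pages are clearly not duplicates.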
Thanks everyone. I thought that was the case but it's always good to get a second opinion!
Have a read of this: Google: Your Content In Tabs & Click To Expand May Not Be Indexed Or Ranked
I'd have a look at the source code and see if the content is readable. Also, copy and paste snippets of the "hidden" text into Google to see if it is being indexed.
Thanks again for everyone adding their thoughts.
The traffic decline seems to have started on Thursday 4th December. We did well out of the Penguin 3.0 update, having previously been negatively affected (a link clean-up and disavow were put in place earlier this year). Our rankings on important terms have now dropped below their pre-Penguin 3.0 uplift, though.
Our keywords have continued to drop again today with several showing a loss of 7-10 places (on top of previous drops).
I did test the expanding panels and found that Google did seem to be indexing the content okay. I have tried making one of the review panels permanently expanded to see if it makes a difference, though I still worry it makes the page look very spammy: the keyword is the same as the item being reviewed, so it is repeated numerous times on the page.
Any further thoughts?
Thanks,
Kate