Questions created by Clickmetrics
Possible duplicate content issue
Hi, Here is a rather detailed overview of our problem; any feedback or suggestions are most welcome.

We currently have 6 sites targeting the various markets (countries) we operate in. All websites are on one WordPress install but are separate sites in a multisite network; content and structure are pretty much the same barring a few regional differences. The UK site has held a pretty strong position in the search engines over the past few years.

Here is where we have the problem. Our strongest page (from an organic point of view) has dropped off the search results completely for Google.co.uk. We picked this up through a drop in search visibility in SEMrush, and confirmed it by looking at our organic landing page traffic in Google Analytics and Search Analytics in Search Console.

Here are a few of the assumptions we've made and things we've checked:

- Crawl or technical issues: nothing serious found.
- Bad backlinks: no new spammy backlinks.
- Geotargeting: this was fine for the UK site; however, the US site, a .com (not a ccTLD), was not set to target the US (we suspect this to be the issue, but more below).
- On-site issues: nothing wrong here. The page was edited recently, which coincided with the drop in traffic (more below), but these changes did not affect things such as the title, H1, URL or body content - we replaced some call-to-action blocks, swapping a custom one for one built into the framework (Div).
- Manual or algorithmic penalties: nothing reported by Search Console.
- HTTPS change: we did transition over to HTTPS at the start of June. The sites are not too big (around 6K pages) and all redirects were put in place.

Here is what we suspect has happened: the HTTPS change triggered Google to re-crawl and reindex the whole site (we anticipated this). During this process an edit was made to the key page, and through some technical fault the page title was changed to match the US version of the page. Because geotargeting was not turned on for the US site, Google filtered out the UK page as duplicate content, thereby dropping it from the index.

What further supports this theory is that a search on Google.co.uk returns the US version of the page, and with country targeting on (i.e. only return pages from the UK) the UK version of the page is not returned. Also, a site: query on Google.co.uk DOES return the UK version of that page, but with the US title. All these factors lead me to believe that it's a duplicate content filter issue caused by incorrect geotargeting. What does surprise me is that the .co.uk site has much more search equity than the US site, so it was odd that Google chose to filter out the UK version of the page.

What we have done to counter this is as follows:

- Turned on geotargeting for the US site
- Ensured that the title of the UK page says UK and not US
- Edited both pages to trigger a new last-modified date and so the two pages share fewer similarities
- Recreated the sitemap and resubmitted it to Google
- Re-crawled and requested a re-index of the whole site
- Fixed a few of the smaller issues

If our theory is right and our actions do help, I believe it's now a waiting game for Google to re-crawl and reindex. Unfortunately, Search Console is still only showing data from a few days ago, so it's hard to tell if there have been any changes in the index. I am happy to wait it out, but you can appreciate that some of our senior management are very nervous given the impact of losing this page and are keen to get a second opinion on the matter.
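For what it's worth, here is a rough Python sketch (with a placeholder URL and title fragment rather than our real ones) that we've been using to confirm the live UK page now serves a 200 and the corrected UK title, so any remaining problem is in Google's index rather than on our side:

```python
# A rough check (placeholder URL and title fragment, not our real ones) that the
# live UK page now returns a 200 and a title containing the UK marker, so any
# remaining problem is in Google's index rather than on our side.
import re
import requests

UK_PAGE = "https://www.example.co.uk/key-page/"   # placeholder URL
EXPECTED_TITLE_FRAGMENT = "UK"                    # placeholder title marker

def check_page(url, expected_fragment):
    response = requests.get(url, timeout=10, allow_redirects=True)
    match = re.search(r"<title>(.*?)</title>", response.text,
                      re.IGNORECASE | re.DOTALL)
    title = match.group(1).strip() if match else "(no title found)"
    print(f"{url} -> HTTP {response.status_code}")
    print(f"Title: {title}")
    if expected_fragment not in title:
        print("WARNING: title does not contain the expected regional marker")

if __name__ == "__main__":
    check_page(UK_PAGE, EXPECTED_TITLE_FRAGMENT)
```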
Does the Moz Community have any further ideas or insights on how we can speed up the indexing of the site? Kind regards, Jason
Intermediate & Advanced SEO | Clickmetrics
Drop in traffic, spike in indexed pages
Hi, We've noticed a drop in traffic compared to the previous month and to the same period last year. We've also noticed a sharp spike in indexed pages (they've almost doubled) as reported by Search Console. The two seem to be linked, as the drop in traffic coincides with the spike in indexed pages. The only change we made to the site during this period is that we reskinned our blog. One of those changes is that we've enabled 'normal' (not AJAX) pagination. Our blog has a lot of content, and we have about 550-odd pages of posts. My question is: would this affect the number of pages indexed by Google, and if so, could it negatively impact organic traffic? Many thanks, Jason
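PS: to get a feel for what the new pagination exposes, here's a rough Python sketch I put together. It assumes a default WordPress-style /blog/page/N/ URL pattern and a placeholder domain (neither is necessarily our real setup), and simply reports each sampled page's status code and whether it carries a noindex meta tag:

```python
# Sample the paginated blog archive and report each page's status code and
# whether it carries a robots noindex meta tag. The /blog/page/N/ pattern and
# domain are assumptions (default WordPress pagination), not our real setup.
import re
import requests

BASE = "https://www.example.com/blog/page/{}/"   # placeholder domain and pattern
NOINDEX_RE = re.compile(r"<meta[^>]+content=[^>]*noindex", re.IGNORECASE)

def audit_pagination(last_page=550, step=50):
    for n in range(1, last_page + 1, step):
        url = BASE.format(n)
        response = requests.get(url, timeout=10)
        noindex = bool(NOINDEX_RE.search(response.text))
        print(f"{url}: HTTP {response.status_code}, noindex={noindex}")

if __name__ == "__main__":
    audit_pagination()
```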
Technical SEO | Clickmetrics
Mass HTTP to HTTPS move
Hi, As part of an on-site SEO optimisation process, we've identified moving over from HTTP to HTTPS - this is also in part to ensure our on-site forms are secure. Our website has a high traffic volume (top two in our industry), and we are concerned about what impact 301-redirecting from HTTP to HTTPS would have on our organic traffic, both in terms of how Google would react to this mass 301 redirect and the potential loss of 'search value' from inbound links. Privacy issues aside, would the minor quality-signal improvement be worth the move? Does anyone have experience with such a move - was the outcome positive? Many thanks, Jason
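PS: if it helps frame the question, this is the kind of spot-check we would plan to run after the move (the domain and paths below are placeholders) to confirm each HTTP URL 301s to its exact HTTPS counterpart in a single hop:

```python
# Spot-check that a sample of HTTP URLs 301-redirect to their exact HTTPS
# counterparts in a single hop. The domain and paths below are placeholders.
import requests

DOMAIN = "www.example.com"                     # placeholder domain
SAMPLE_PATHS = ["/", "/about/", "/contact/"]   # placeholder paths

def check_redirects():
    for path in SAMPLE_PATHS:
        http_url = f"http://{DOMAIN}{path}"
        response = requests.get(http_url, timeout=10, allow_redirects=False)
        location = response.headers.get("Location", "")
        ok = response.status_code == 301 and location == f"https://{DOMAIN}{path}"
        status = "OK" if ok else "CHECK MANUALLY"
        print(f"{http_url}: {response.status_code} -> {location} [{status}]")

if __name__ == "__main__":
    check_redirects()
```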
Technical SEO | Clickmetrics
Handling pages that are no longer relevant (both permanently and temporarily)
Hi, We run a travel site with a number of programs, and each program has its own dedicated page, i.e. example.com/programs/program-xyz. Some of these programs stop running and we no longer offer them; other times they are on hold and will be reactivated later. Our old strategy was to 301-redirect these programs to another, relevant program. However, I believe that could be flawed. Would it not be a better solution to serve the page as normal (with a 200 code) and, instead of showing the details of the program, show some text saying the program has stopped and list a few suggestions? I just don't want to set off any spam flags by pushing SE value via a 301 redirect to unrelated pages.

Here are some other scenarios I was thinking of:

- For programs that are only temporarily on hold (i.e. not taking bookings for now): 302-redirect those to more appropriate pages.
- For programs that are permanently on hold (i.e. will never take bookings again): show a custom 404 or 410 page (with text suggesting different programs).

Any suggestions or feedback on this would be most appreciated. -Jason
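PS: to make the scenarios concrete, here's a rough Python sketch of how we might verify each program URL returns the status we intend. The slugs, statuses and domain below are made-up examples, not our real programs:

```python
# Verify each program URL returns the status we intend: 200 for live programs,
# 302 for temporarily on-hold ones, 410 for permanently retired ones. The slugs,
# statuses and domain below are made-up examples, not our real programs.
import requests

BASE = "https://www.example.com/programs/{}"   # placeholder domain
EXPECTED = {
    "program-abc": 200,   # live and bookable
    "program-def": 302,   # temporarily on hold, redirected to a related program
    "program-xyz": 410,   # permanently retired, "gone" page with suggestions
}

def verify_program_statuses():
    for slug, expected_status in EXPECTED.items():
        url = BASE.format(slug)
        response = requests.get(url, timeout=10, allow_redirects=False)
        flag = "OK" if response.status_code == expected_status else "MISMATCH"
        print(f"{url}: expected {expected_status}, got {response.status_code} [{flag}]")

if __name__ == "__main__":
    verify_program_statuses()
```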
Intermediate & Advanced SEO | Clickmetrics
Staggered Blog Posting
Hi, My client has recently launched a new site - while the site was under development (a period of around 6 months) they built up a large number of posts for their new blog. I have advised them against uploading all the posts in one go (i.e. going from 0 to 100 in one day), as I'm sure this would be viewed with suspicion by the search engines. My question is (and I'm not looking for a magic number here): what would be the best way to publish these blog posts, and what possible penalties could be triggered if we do it incorrectly? I do believe the content is unique and unpublished elsewhere. My suggestion to my client was to create a content calendar and set dates on which the various posts should be published. Further, I suggested the schedule be staggered or randomised, i.e. not every second day but varied - so, for example, two posts one day, then a break for a day or two, then one post, then another two the next day, and so on. Any thoughts from the Moz Community? Thanks, Jason
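PS: this is roughly what I mean by a staggered calendar - a quick Python sketch that spreads the backlog over the coming weeks with a random zero-to-two posts per day (the post titles are placeholders for the client's real drafts):

```python
# Generate a staggered publishing calendar: spread the backlog over the coming
# weeks with a random zero-to-two posts per day so the pattern isn't uniform.
# The post titles are placeholders for the client's real drafts.
import random
from datetime import date, timedelta

def build_schedule(num_posts=100, start=None):
    start = start or date.today()
    schedule = []
    day = start
    post_number = 1
    while post_number <= num_posts:
        for _ in range(random.randint(0, 2)):   # 0, 1 or 2 posts on a given day
            if post_number > num_posts:
                break
            schedule.append((day, f"Draft post #{post_number}"))
            post_number += 1
        day += timedelta(days=1)
    return schedule

if __name__ == "__main__":
    for publish_date, title in build_schedule(num_posts=10):
        print(publish_date.isoformat(), title)
```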
Content Development | Clickmetrics
Regional and Global Site
We have numerous versions of what is basically the same site, each targeting a different country, such as the United States, United Kingdom and South Africa. These websites use country-code TLDs to designate the region, for example .co.uk and .co.za. I believe this is sufficient (with a little help from Google Webmaster Tools) to convince the search engines which site is for which region. My question is: how do we tell the search engines to send traffic from regions other than the above to our global site, which would have a .com TLD? For example, we don't have a Brazilian site, so how do we drive traffic from Brazil to our global .com site? Many thanks, Jason
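PS: my working assumption is that hreflang annotations with an x-default pointing at the .com would cover the "everyone else" case (e.g. Brazil), but I'd welcome confirmation. Here's a rough sketch of the tags I have in mind, with placeholder domains and path:

```python
# Generate hreflang link tags for each regional site plus an x-default pointing
# at the global .com, which should catch visitors from countries we don't have
# a dedicated site for (e.g. Brazil). Domains and the path are placeholders.
REGIONAL_SITES = {
    "en-us": "https://www.example.com",
    "en-gb": "https://www.example.co.uk",
    "en-za": "https://www.example.co.za",
}
GLOBAL_SITE = "https://www.example.com"   # the .com doubles as the x-default target

def hreflang_tags(path="/"):
    tags = [f'<link rel="alternate" hreflang="{lang}" href="{root}{path}" />'
            for lang, root in REGIONAL_SITES.items()]
    tags.append(f'<link rel="alternate" hreflang="x-default" '
                f'href="{GLOBAL_SITE}{path}" />')
    return "\n".join(tags)

if __name__ == "__main__":
    print(hreflang_tags("/programs/program-xyz"))
```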
Intermediate & Advanced SEO | Clickmetrics