Site 'filtered' by Google in early July.... and still filtered!
-
Hi,
Our site got demoted by Google all of a sudden back in early July. You can view the site here: http://alturl.com/4pfrj and you may read the discussions I posted in Google's forums here:
http://www.google.com/support/forum/p/Webmasters/thread?tid=6e8f9aab7e384d88&hl=en
http://www.google.com/support/forum/p/Webmasters/thread?tid=276dc6687317641b&hl=en
Those discussions chronicle what happened, and what we've done since. I don't want to make this a long post by retyping it all here, hence the links.
However, we've made various changes (as detailed), such as getting rid of duplicate content (using noindex on various pages, etc.) and ensuring there is no hidden text (we made an unintentional blunder there by using a 3rd party control which stored certain data in CSS-hidden text). We have also filed reconsideration requests with Google and been told that no manual penalty has been applied. So the problem is down to algorithmic filters which are being applied.
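(For anyone wanting the specifics of the noindex approach: we made the tag Google-specific, since Bing has no problem with the site. Roughly this, with our ASP.NET control IDs omitted:)

```html
<!-- Keep the page out of Google's index only -->
<meta name="googlebot" content="noindex" />
<!-- Other engines (e.g. Bing) may continue to index and follow -->
<meta name="robots" content="index, follow" />
```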
So... my reason for posting here is simply to see if anyone here can help us discover whether there is anything we have missed. I'd hope that we've addressed the main issues and that eventually our Google ranking will recover (ie. filter removed.... it isn't that we 'rank' poorly, but that a filter is bumping us down to, for example, page 50).... but after three months it sure is taking a while!
It appears that a 30 day penalty was originally applied, as our ranking recovered in early August. But a few days later it dived down again (so presumably Google analysed the site again, found a problem and applied another penalty/filter). I'd hope that might have been 30 or 60 days, but 60 days have now passed.... so perhaps we have a 90 day penalty now. OR.... perhaps there is no time frame this time, simply the need to 'fix' whatever is constantly triggering the filter (that said, I 'feel' like a time frame is there, especially given what happened after 30 days).
Of course the other aspect that can always be worked on (and oft-mentioned) is the need for more and more original content. However, we've done a lot to increase this and think our Guide pages are pretty useful now. I've looked at many competitive sites which list in Google and they really don't offer anything more than we do..... so if that is the issue it sure is puzzling if we're filtered and they aren't.
Anyway, I'm getting wordy now, so I'll pause. I'm just asking if anyone would like to have a quick look at the site and see what they can deduce? We have of course run it through SEOMoz's tools and made use of the suggestions. Our target pages generally rate as an A for SEO in the reports.
Thanks!
-
Glad to hear things are progressing, even if it is a bit slow.
Your pages would likely be indexed a bit faster if they were linked from the home page. In a more typical situation, if you added a new location to your site there might be a "New Locations" block on your home page. In this particular situation it is more of a matter of being patient.
-
Hi,
Just checking in to give a quick update.... November 1st, so a good 10 days on, and we're still ranked in Google, which is great. I still think I'll 'reserve' some judgment until a 30 day cycle has passed... but it's looking good.
...and the pages which Google has indexed are ranked reasonably, generally in the top 4 pages. Obviously I'd like to see them higher, but it's a good start. The downside is simply that there are few pages indexed, and whilst we steadily work on adding content Google doesn't tend to add the pages in a great hurry; they seem to take a week or so. But I'll see how it goes.
-
Absolutely. I think for now we'll just stick with adding area pages with new guides added. All original, unique. Hopefully as that builds up it will help maintain the site's overall 'quality' in Google's eyes.
Much could happen in the next 10 days.... but it's a positive sign anyway.
-
I am exceptionally pleased to hear this result.
You are absolutely correct not to get too excited yet. I would suggest you continue with your current plan for now. If after 10+ days your rankings remain, then you have likely resolved the problem and should just exercise caution in re-adding the additional pages.
-
Thanks Ryan.... and....
HURRAH! It looks like the penalty/filter has been lifted today!! At last. (Although I won't get too excited until another 7 to 14 days have gone by and I can see that we're remaining there, and don't get penalised again as in August).
But.... yesterday a search for "holiday lets in Beesands" came up on p27. Today.... it is at the top of p2! (Not quite p1 yet, but coming from where we were that's great news).
The downside is, having removed virtually everything from the Google index we have very few pages actually indexed that you can find, right now. But that was intentional in order to get a grip on the situation, remove all pages which lack enough original content, and to be able to monitor a very limited set of pages and their progress in the index. So, that (hopefully) achieved, the concentration can continue to be on building up the content and adding more pages. And obviously there is more to do on the site itself, but it's a good step.
Thanks for the help!
-
Fantastic to hear the progress you are making! Very sincerely so.
-
Ryan, I think you're reading a little too much into my responses. 'We'll review it' is not intended as resistance, but simply that we'll have a think about what to do in each specific case. For example, as you suggest regarding the footer, a sentence like "Now servicing California including Sacramento, Los Angeles and Beverly Hills" wouldn't be spammy. So the question was, should I replace what I had with some shorter, more natural text and links, or should I just bin it altogether?
So... regarding your most recent suggestions. If you take a look at the site now, you'll see that:
1. The links in the footer have gone. As I said, I want to review it a bit as it may need some re-arrangement etc. But for now I've just removed them all.
2. There is now a Cottages & Lets tab. This again may need tweaking, I'm not sure. Currently clicking it takes you to that search, but to the 'level' of your last search. So first time it is at the root, subsequent times it will remember your last search and take you there.
3. The issues with editing the URL are now fixed. ie. taking the town and county off and just having /England now shows England lets.... adding a county shows that county, etc. I think this is as you'd expect.
Onwards and upwards!
-
At the time I replied I hadn't seen the PS edit.
My only disappointment or negative tone related to what I felt was resistance to changes which are well within your control to make immediately. For example, the footer section of your site is very spammy.
If I add a footer to my site which says "Now servicing California including Sacramento, Los Angeles and Beverly Hills", with each of the three cities being anchor links to pages within my site, that would be fine. If I were to take the same approach listing a dozen or more locations, that would be spammy.
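To make that concrete, here is roughly what the acceptable version would look like (the link URLs are hypothetical, not taken from your site):

```html
<!-- One natural sentence, a handful of locations, each city linked once -->
<p>
  Now servicing California including
  <a href="/locations/sacramento">Sacramento</a>,
  <a href="/locations/los-angeles">Los Angeles</a> and
  <a href="/locations/beverly-hills">Beverly Hills</a>.
</p>
```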
You have two spammy types of blocks in your footer. These blocks appear on every page of your site. I can understand that changes take time and so forth, but frankly removing those blocks should be a very fast and easy change, and it would resolve a negative quality issue with your site.
We both have the same goal, the proper re-indexing of your site. I view the changes necessary to improve your site as mostly items which can be taken care of relatively quickly. Of course, generating the unique content on each page will take a considerable amount of time and effort, but site wide changes of the footer, social icons, URL structure and so forth are generally fast changes.
As a general SEO principle, the important content of your web page should be above the fold (i.e. viewable immediately when the page loads). The footer is for items you have to present but otherwise don't care much about. Copyrights, Privacy Policy, Disclaimers, etc. The footer is not a good place to stuff links you want users to use.
Again, sorry if I came across harshly. My social graces often fall quickly as I move deeper into a site and my "Let's get it done...GO GO GO" spirit kicks in.
-
Thanks Ryan. I hope my PS edit didn't 'cross' with your post or sound negative.
I'm very thankful for all your input, and indeed, for even bothering to take time to look at the site and reply. You didn't need to and I appreciate it.
But I'm certainly aiming as high as I can with the site. I hardly have the resources of Google et al at my disposal, but great things aren't always achieved by mighty powers. It's good ideas, a little bit of ingenuity, originality, and steady, chipping-away hard work that often prevails. Being knocked out of Google can be quite depressing.... but I'm not cast down and consider it to be a blip which ultimately will result in improvements to the site which might otherwise not have occurred. In the end it should make the site better.... because adversity works like that: you learn from it and end up wiser and stronger in the end. Hopefully!
-
Well said Ian. Please accept my apologies and thanks for correcting me in such a cheerful manner.
-
Not at all Ryan! You do me a disservice.
My goal is as high as yours, if not higher. My goal is not to make this 'A world-class property rental site', but to make it the best. Period. I see no reason for not aiming high with it.
But I realise that it's not going to be the best next week, that's all. It's taken time to get this far (which is a million miles from what it was last year, Google ranking aside) and it will take more time to get to where I want it to be. But big goals are achieved with many steps. You climb a mountain one step at a time.
I'm simply talking about the next step. I'm on the side of the mountain, it's freezing cold and I have a particular problem to solve. So whilst I intend to reach the top I'm simply focussed on solving the current problem. Once done I'll be able to move on to the next step(s) without this particular hindrance.
As I said before, there are many more issues on my list of TODOs than the ones you've identified (but your input is welcome). Those will get addressed one by one, and I'm confident of progressing. I'm not aiming to do 'the minimum'.... my question about the content had a reason behind it, which is how best to target resources. That could be to make the guides longer and better, or it could be to add more. At present I need good pages, and more of them. One fantastic page won't be enough. 100 poor pages won't help. 20 good pages will do better. Once we have those, we can add 20 more. Then make 20 great ones, then another 20 good ones.... and so on. One step at a time.
But don't doubt the goal!
(Only, when you're currently down on p50.... to come on a forum boasting of making the best site out there is going to look a little like a squeaking mouse making great claims. First you walk, then you run.)
(EDIT...
P.S. I have to say, however, that I don't quite understand why you have taken the tone you have here? I've heard all you've said, have embraced the positive changes you recommend, have begun to implement them and intend to improve the site more. I haven't disagreed or argued with you, and yet you seem to end on a somewhat negative note? My original question is specific to Google, not asking what needs making better on the site generally. But your responses are nevertheless helpful and welcome. So I thank you for them, but can assure you I'm not taking a 'miserly' approach to the site, simply a realistic, one-step-at-a-time approach.)
-
I agree with most of what you shared, except the last part. I have the utmost confidence Google is capable of processing any amount of data efficiently. The Panda changes are algorithmic changes, and I doubt anyone on the planet is as efficient at applying an algorithm to large data sets as Google.
I would also say you are much more confident about the quality of your site than I am. I recognize you have worked very hard on your site and I agree the small amount I have seen is much improved. I would say, though, that your current site still requires major work.
We are viewing your site from different perspectives. It seems your viewpoint is along the lines of "what is the minimum amount of changes necessary to be indexed properly?" My viewpoint is, what changes need to be made to make this site a world-class property rental site? The biggest difference is, if you fall short of your goal the site remains unlisted for 30+ days. If you fall short of the standards I set forth, your site still would be listed but you would not rank in the top 3.
-
Ok, thanks for taking the time to read those and research it. As I said, I'm happy to be corrected, but I was just going on what seems to be stated in a number of places... and in relation to what I've actually experienced on our site. We've made changes but seen no results yet, so either it is just the normal sort of 30 day cycle that needs to pass, or we await a manual Panda analysis (on a similar time-frame perhaps, so as you say it doesn't make much difference), OR we have had a penalty applied for a certain amount of time (90 days?).... OR we still haven't fixed the issues. I'd be surprised (now) if the latter is true; I believe the site should index better (p50+ is not poor ranking, it is a penalty/filter), so hopefully it's just a matter of time.
What we did have happen with the site is this though....
6th July a penalty was applied. Went from a good number of hits to a trickle.
(Made various changes seeking to address the causes)
5th August (30 days later) suddenly Google traffic came back. Carried on like that, well ranked, for 5 days..... then suddenly disappeared again.
Now, it is the fact that we 'returned' when the 30 days expired that indicates a 30 day penalty at that stage.... and the fact that we disappeared again 5 days later INDICATES (that's all) that perhaps a manual process was run again on the sites in Google's index and we got knocked under again.
It's speculation of course.... but it's important to have an 'idea' of what is going on so that I know whether I still have things to fix, or whether the fixes made have been right and it is now just a matter of time to see the results. Of course the majority of improvements are all good anyway (more content will always help), but some changes could be detrimental, and I wouldn't want to be going around in circles if all the situation really needs is for some time to pass so that the impact of changes already made can show.
(And that all said... whether true or not.... It seems to be perfectly feasible and sensible that certain analysis by Panda WOULD be done periodically, rather than daily, simply because of the processing required.)
-
Thank you for sharing these articles. I have read them all. It seems the conclusions are based upon a confusing tweet conversation, and the statements of Matt McGee which seem to be his interpretations provided without the source.
Some official statements which I found in the articles you shared:
"Google rolled out a wholescale change to its algorithm last week". This change was the Panda update. The article also linked to the official Google Webmaster Central response: http://www.google.com/support/forum/p/Webmasters/thread?tid=76830633df82fd8e&hl=en
That response shared "According to our metrics, this update improves overall search quality. However, we are interested in hearing feedback from site owners and the community as we continue to refine our algorithms." There was a link provided to an official Google announcement of Panda: http://googleblog.blogspot.com/2011/02/finding-more-high-quality-sites-in.html
"But in the last day or so we launched a pretty big algorithmic improvement to our ranking"
I see repeated references directly from Google stating their algorithms have changed as a result of Panda. I trust any information from Google directly, Matt Cutts directly, etc. I have learned to be highly distrusting of other sources of information UNLESS the specific author is trusted or the author provides specific references to source information.
I trust Danny Sullivan but nothing in his article supported the idea you mentioned. I have reviewed numerous other sources and, as of yet, I have not found any direct support of this position other than the tweet, which I am still not convinced has been properly interpreted.
http://www.youtube.com/watch?v=BaZtvm54r_M
I could be wrong here, but even if I am, the bottom line is that any site affected by Panda, or which is otherwise not providing its content in the best manner possible, should make corrections ASAP. Doing so will likely resolve any Panda or other issues. The only question is how quickly the site will recover. It seems Panda changes have been coming almost monthly, which is about the time it takes Google to crawl all the pages of an average website anyway.
-
Ok will do...!
Here's a link to an article about Panda and daily/manual application. I think it links to other sources:
http://www.seroundtable.com/google-panda-manual-13387.html
And this page is probably a good 'spring board' to lots of info about Panda:
http://www.stayonsearch.com/complete-guide-to-the-google-panda-update-50-articles-resources
EDIT: Here's another.... http://searchengineland.com/coming-soon-google-panda-update-2-2-80848
In which it is stated... "
Cutts also said there have been no manual exceptions made to sites that were wrongly affected, but there have been recompilations of data that may have helped some sites.
In particular, he noted that the Panda algorithm is run against Google’s entire index of pages on an infrequent basis, in order to tag certain sites that should be dinged by it, as opposed to some of its automatic spam detection tools.
For example, Google’s constantly scanning for pages that might use hidden text. If it spots them, then it may assess a penalty.
Google is not constantly scanning for pages that might get hit by its Panda penalty. Instead, Google manually runs that algorithm, which then determines web sites that should be hit by it.
This also means that making changes to a site hit by Panda won’t produce any immediate change in Google. Instead, such changes — if they are beneficial — wouldn’t get registered by Google until the next Panda assessment.
When is Panda run? Google didn’t say. But it seems to be something that runs every few weeks and in association with when the algorithm is improved (Panda 1.0, Panda 2.0, Panda 2.1, Panda 2.2, etc.)"
That seems to state the scenario I mentioned. So I'm stuck waiting until the next Panda assessment, according to Mr Cutts himself.
-
I would appreciate reading any information from Google regarding the Panda information you shared.
The only other comment I will share is regarding item #4 you mentioned, the footer links. Please take a look at your analytics to see if your site's visitors are actually using those links. If a link is provided which is not being used, it should be either modified so your visitors start using it or removed.
The use of footer links in general is quite low. The use of footer links which are lists of dozens of cities or location names, presented in the manner they appear on your site, is normally exceptionally low or non-existent. Try checking to see if any of your site's visitors (not counting you or employees) have used those links.
-
Thanks again Ryan, all good and useful points, all of which we'll aim to fix (there's plenty more I could list but as with any software development it's all a case of prioritising what needs doing and working on the most important things first.... the site has come a long way in a short time).
So... agreed, it needs a 'cottages-lets' (Rentals, or whatever) tab to relate to that search. We know that, there's a reason why it hasn't arrived yet, but it will soon!
Secondly, I take your point about the URL structure, but the missing 'cottages-lets' bit aside, I think the structure makes sense. I think the country/county/town folder arrangement is logical.
Thirdly, I take your point about editing the URLs. I'll ensure that just having /England at the end (no county or town) returns the right results. We'll fix that too.
Fourthly, the footer links.... yes, I see your point. I actually find them useful so I'm not 100% sure I want to remove them all, but certainly they could be reduced, and/or worded better... and the guides given as much prominence as the holiday let search.
Fifthly, the social icons. I'll have a think about this one. We do now have the floating bar at the side which achieves this aim.... the bottom ones were left because.... well, they look quite nice actually. Anyway, we'll review it!
...
Now, regarding your point about Panda and how it works... I have to say that that is not my understanding based upon a number of things I've read on the web about it, including quotes from Google employees (Matt Cutts I suspect). I imagine that Panda is a framework comprising changes made to the general Google crawling/indexing process and algorithms (ie. those things which happen 'daily') and also some algorithms and processing which is applied manually and periodically.
I believe (from what I've read) that the 'thin content' analysis (which may involve tools similar to CopyScape) is a time-consuming task so it is not run every day against websites, but is run manually. From what I've heard Google state, it is something they run every month or two. THEREFORE if it is that which has been applied to my site and a penalty applied (as I believe it has) then I won't escape the penalty until such time as that process is re-run against all the site's pages and the assessment of 'thin content' is reviewed. I believe the indexed pages now are rich enough to 'pass the test'.... but the penalty remains until the day when this process is run.
If I'm wrong about that I'm happy to be corrected, but it is certainly what I've heard and read. I'd have to dig out URL sources to prove it but you can probably find them if you search on Panda etc.
-
Hi Ian.
Regarding your URLs, I have some questions. There is often some alignment between the navigation bar and the URLs. For your example URL I don't understand how it works: http://www.go2-holidays.com/cottages-lets/england/devon/woolacombe. There is no "cottages-lets" option in your navigation bar. If I am on your home page, what is the first click I would make to get to your /cottages-lets page?
My second question is whether /devon/woolacombe is necessary. It seems the two would always go together. Could you ever have Woolacombe without Devon? If not, perhaps /devon-woolacombe would be a better URL, removing an unnecessary level. To that end, perhaps the England level can be removed as well. It depends on how you display your other pages.
Your current setup is quite confusing. I would have expected the /cottages-lets page to show the most options and the number to be filtered down as we move deeper into the URL. If I am on the /woolacombe page and I try to back up levels by modifying the URL, the other pages show only 1 or 0 cottages. The URL is not synced with the drop-down selection boxes. This issue should be resolved.
In short, the URL design needs to be a carefully planned out part of your site. It helps users and search engines navigate and understand your content. I have the impression it is more a quick afterthought.
Regarding your content, please understand Panda is not a tool which is intermittently applied. It is a permanent adjustment to the algorithm used to rank web pages.
With the above in mind, your page has enough content. The next question is whether that content is well written enough. I would suggest locating a grammar tool and applying it to the text at the bottom. Also, I would advise using some interlinking within the text. When you mention Croyde, for example, if you have a Croyde page on your site, use the term "Croyde" as anchor text linking to that page.
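For example, something along these lines (the URL is illustrative, following your new URL style):

```html
<!-- The place name itself is the anchor text, inline within the sentence -->
Nearby <a href="/cottages-lets/england/devon/croyde">Croyde</a> offers a popular surfing beach.
```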
Also, at the bottom of every page you have a footer which is duplicated throughout your site and is simply loads of keywords. I would recommend removing it. One block is titled "Holiday Lets, Lodges & Cottages in Cornwall" and the other block is titled "Holiday Lets, Lodges & Cottages in Devon". I suspect if you were to perform some analytics you would find those links are very rarely used. They look spammy and simply add duplicate content to every page of your site.
A final suggestion. The social icons in your footer would probably be a lot more useful if they were relocated to the big empty space in your header.
-
Hi Ryan,
I wonder if you might be able to help with a couple of questions.
Firstly, we've begun the process of changing our URLs. All the search and guide pages now use new-style lowercase URLs with no spaces etc. The actual property detail URLs are still old style (soon to change).
I've been quite bold in having Google de-index virtually everything, just to get a clean sheet of paper to work on. As the URLs were all changing I was able to have them remove the old 'directories' in one quick shot. Whether that was 'wise' or not I'm not sure, but I couldn't see the downside. Anyway, they've done all that and now we only have a few main pages in the index, and certain ones that we're making available (by not being marked noindex). This will mean I can watch over what's going on and hopefully ensure that only the richest pages make the index.
So.... a couple of questions. Firstly, take a look at this:
http://www.go2-holidays.com/cottages-lets/england/devon/woolacombe
This has eliminated any external duplication and has our own area guide on it. Do you think this would 'pass the Panda test'.... ie. be unique enough? Or is our guide going to need to be much longer?
Secondly.... given that we appear to be under a Panda 'thin content' penalty, and Google only run the Panda tool periodically (as far as I understand) does that mean we'll be stuck under the penalty until the Panda analysis is run upon the site again (which could be only every month or two)?
Thanks!
Ian
P.S. Oh, there was a third question! We have comboboxes for countries/counties/towns. As with many similar comboboxes on websites that means that a list of places will be embedded in the HTML.... and when I run Copyscape on the page I get several matches for that like this:
Bath | ... Bath | Bedfordshire | Berkshire | Berkshire | Bristol | Bristol | Buckinghamshire | Buckinghamshire | Cambridgeshire | Cambridgeshire | Cheshire | ... and Teesside | Cleveland and Teesside | Cornwall | Cornwall |County Durham | County Durham | Cumbria | Cumbria | Derbyshire | ... North Yorkshire | Northamptonshire | Northamptonshire | Northumberland | Northumberland | Northumberland | Nottinghamshire | Oxfordshire | Oxfordshire | Rutland | Shropshire | Shropshire | Somerset | Somerset | South Yorkshire | South Yorkshire |Staffordshire | Staffordshire | Suffolk | ... South Yorkshire | South Yorkshire | Staffordshire | Staffordshire | Suffolk | Suffolk | Surrey | Surrey | Sussex | Sussex | Tyne and Wear | Tyne and ...
Is that going to be a problem? Or would Google disregard that sort of duplication given that it isn't 'real' content?
If it could be a problem I'd rather investigate ways of keeping that text out of the main HTML (ie. generate it via AJAX when required).
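If we do go down that road, the rough idea would be something like this (just a sketch: the element ID and the endpoint are made up for illustration, and we'd need to check how it behaves for visitors without JavaScript):

```html
<select id="county-select">
  <option value="">Select a county...</option>
  <!-- Options are added client-side, so the long list of place names never
       appears in the HTML source that Copyscape (or a crawler which doesn't
       run JavaScript) would see -->
</select>
<script>
  // Hypothetical endpoint returning a JSON array like ["Bath", "Bedfordshire", ...]
  fetch('/api/counties')
    .then(function (response) { return response.json(); })
    .then(function (counties) {
      var select = document.getElementById('county-select');
      counties.forEach(function (name) {
        var option = document.createElement('option');
        option.value = name;
        option.textContent = name;
        select.appendChild(option);
      });
    });
</script>
```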
-
Thanks Ryan. The reviews pages follow an approach whereby we check if there are any reviews for a property and if not we mark the page NOINDEX. If there are reviews then it is indexed. But as you rightly point out, that may be fine for those properties with a number of reviews, but in the case you cite there are only a couple of short ones, hence not as much unique text on the page as we'd like.
I think your suggestion of combining the two may well be a solution. It would certainly merge what is unique from both pages into one. For now, I may hold off on that though considering that we've noindexed all the main property pages, but if I can see the site get to a recovered position then it may well be worth merging detail and review pages before submitting detail pages for indexing again.
Finally, on the URL side.... the pages all currently work fine if you type in the URLs in lowercase. The code deals with working out what the casing of the content should be. So there are no 404 problems. The main issue is simply what we actually use as our links and what is submitted to the search engines for indexing. So we'll move towards lowercase, no-spaces, shorter URLs soon.
-
Ian,
You are absolutely correct with your meta tags. My tool displayed the general value for the meta robots tag and not the Google-specific value.
Your approach is sound. I have verified your property pages are not indexed. A simple Google search for "site:www.go2-holidays.com" shows 470 pages indexed.
My concern is your review pages have hardly any unique content on them. An example: http://www.go2-holidays.com/HolidayCottageLodgeOrLetReviews/Derbyshire/Nr Tissington, Ashbourne/HOS-E4848/The Drying Mill
You could consider combining the reviews with the property page. With only about 100 words of descriptive text on the property page, and less than 100 words of unique text on the review pages, it seems like a sound match. It can also improve the user experience and help to resolve your unique content issues.
Good luck.
PS. In response to your latest reply, using capitals in a URL does not conflict with any quality guidelines. It is more of a usability issue. It is easier to remember a URL where you can simply type naturally without remembering which letters to capitalize. It also avoids confusion as to why a page is offering a 404 error. Using all lowercase is a web design best practice, not a Google requirement.
-
Little PS.... just as a bit of a 'sigh'.....!!
Try a search on 'holiday lets in bourton on the water'. On page 2 you'll find this result: http://www.holiday-cottages-in-england.com/Cotswolds/Bourton_On_The_Water.htm
Now.... how does that site stay there? It's all just properties from one other affiliate company. It has capitals in its URL... very little original text etc. But no penalty..... I don't 'mind'.... just don't understand why we're struggling in comparison!
-
Thanks Ryan. That's all very helpful.
Regarding the Holly Bush page and it being marked as noindex, the meta tags within it are:
id="ctl00_MasterPageMetaGooglebot2Tag" name="googlebot" content="noindex" />id="ctl00_MasterPageMetaGooglebotTag" name="robots" content="index, follow" />
ie. we have the googlebot tag and the standard robots tag. The googlebot one is marked noindex, not the other one. The reason being that Bing currently has no problem with our site, so I didn't see any reason to keep these pages out of Bing... only out of Google.
I think the tag combination is working correctly because Google now only has 200 or so of the property pages left in their index (from a total of thousands previously). So I don't think we yet need to mark robots as NOINDEX also.
On your suggestion...
1. Yes we have a sitemap which is submitted to Google. (And we don't use robots to prevent crawling, we use noindex, thus ensuring that the noindex pages are crawled and then removed from SERPS.)
2. Ok - will do.
3. Yup.
I think in many respects the plan you suggest is the path we're already on. We simply haven't seen the penalty removed yet. What I think I'll do is give it another 30 days (not sitting still, mind you.... but reviewing again in 30 days' time), and if there is still no change then I may move on to plan B, which is....
...to remove ALL the pages, and then just selectively start submitting single pages back into Google. ie. hand-pick individual ones which we know ought to be fine and see if they rank at all. I'd much rather have 10 well-ranked pages than 10,000 poorly ranked ones.
My only concern with 'plan B' is whether removing all the pages and then only having a handful in the index would actually send out a negative signal to Google which would damage the rankings further. eg. would having more pages be beneficial? This is the trouble with the whole process.... everything feels very delicately balanced. You fix one area but that can inadvertently break another. Just the act of making changes to pages can send signals to Google.... and indeed if traffic to the site acts as a positive ranking signal then losing traffic (as we have done) would appear to act as a catalyst towards a downward spiral!! How do you possibly get back again when your traffic has dried up (at least via Google)?
...Anyway, thanks for your time Ryan and your help. I appreciate it. We'll get there.... because I think the site serves a need, and if lack of content holds it back then we'll simply keep adding it. The intention is not to be an 'also ran' or 'make a quick buck', but to make a great site... so if we can get there, Google ranking would (I hope) naturally recover.
-
Regarding the first item, the URLs, it sounds like you intend to update your site in the future which is great.
The second item, using the page's phrase within the content, is an important SEO consideration which I would strongly recommend applying 100% of the time. With that said, you are right in that the pages which outrank you, including the #1 ranked site, often do not use this best practice. I can't comment on that much further than to say SEO is purely a competition. My strategy has always been to take every action within my control to improve results. You can choose to skip various steps as you deem fit. Keep in mind there are numerous significant metrics which Google uses to rank pages to which we do not have visibility. This challenge only emphasizes the importance of maximizing all the metrics within our control.
The third item is duplicate content. I had examined 3 pages on your site, all of which were property pages. I simply chose the first 2 properties, and then a third property which had the same title as one of the selected properties but was actually a different listing.
My concern is that in your latest example of a "fuller" content page (/Holly Bush Countryside Cottage) there are 126 words in the description. Even if I include the sidebar content the total unique words on the page is 158 out of 1203 words. 87% of the page is duplicate content. It's simply far too much.
With the above noted, it sounds like you wish to focus on your search pages, not your property pages. You mentioned that you added the noindex tag to your property pages. I am not seeing the tag implemented. I have checked the Holly Bush page mentioned above, and the David Carr page mentioned previously. Both pages are marked as index, follow.
The bottom line, your site is very likely experiencing a site wide penalty. The root issue is a very high percentage of your indexed pages are duplicate content. My suggestion is to get control of the situation and begin adding pages back to the index.
Step 1 - create a sitemap for your site listing every URL (a minimal example appears at the end of this post).
Step 2 - review the sitemap and decide which pages you wish to be indexed. Any page which is not ready to be indexed for whatever reason should have the noindex tag applied.
Step 3 - review the remaining pages. Ensure they are unique pages by minimizing any duplicate content.
If the above plan is implemented properly, Google will notice the changes as they crawl your site. Do not use robots.txt to block your pages as this will prevent Google from viewing the page changes such as adding the noindex tag.
I expect it to take a solid month for Google to crawl all the pages on your site. Once the entire site is crawled, Google will then recognize your site complies with their guidelines. At that point it is likely your penalty will be immediately lifted. It is possible you will be kept in a "timeout" period where the penalty remains for a bit longer, but I would expect the issue to be resolved within 30 days.
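To be clear on the sitemap mentioned in Step 1: this is the standard sitemap protocol XML file. A minimal sketch (the URL shown is illustrative only):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page on the site -->
  <url>
    <loc>http://www.go2-holidays.com/devon/beer</loc>
  </url>
  <!-- ...repeat for every URL, including pages you later mark noindex,
       so Google recrawls them and sees the tag... -->
</urlset>
```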
-
Here's another thought...
As I've said, we made the property details pages NOINDEX for googlebot. The idea being (as suggested by another SEO bod) that it would avoid any duplication issues between the descriptions shown on the search pages and the detail pages. Either we focussed on the detail pages or the search pages, but best not to do both, was the suggestion. So, details pages are NOINDEX.
However.... how might Google's algos then view the search page? If that has 10 links to details pages which are marked as NOINDEX, could that be considered odd/suspicious? Does that make the search page appear like a doorway, given that the links out of it are heading off for noindex pages?
If that is possible, perhaps the links out of the search page should be marked as NOFOLLOW?
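ie. each property link on the search page would carry the attribute, something like this (hypothetical URL, just a sketch):

```html
<!-- Suggests to crawlers not to follow through to the noindexed detail page -->
<a href="/property/example-cottage" rel="nofollow">Example Cottage</a>
```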
Or perhaps none of that is really a factor...
-
Hi Ryan,
Thanks for your further reply. You're right that Google is a private company with the freedom to set their own guidelines for inclusion in their index, and, as I said, I do of course respect that and realise that those are the rules I have to play by in order to be indexed (rules, I might add, that I've always sought to follow; they simply changed the application of the rules at some stage this year, hence what originally passed the algorithmic tests now fails, alas...).
However, although a private company, Google is a private company like no other. Its huge growth and dominance of internet search has put it into the position of being something of a gatekeeper of the internet, and as such it wields great economic power over many, many companies, and indeed rivals. As such it is very much in the position of being a monopoly and is subject to legislation regarding the abuse of such a position. It has a responsibility to 'play fair', and if its algorithmic changes can demonstrably be shown to be detrimental to rivals and favourable to its own sites and products then it could place itself in an awkward position with regard to anti-trust laws. So I don't think we can just say 'oh well, they can do what they like', or indeed close our eyes to the intent of any changes they make.
They of course have freedom to continually work on their systems of indexing and to attempt to 'root out' spammy sites... but when legitimate sites become the victims of such attempts their owners have reasonable cause for complaint. The trouble is that there really appear to be few avenues for lodging such complaints or for gaining any visibility on the causes of penalties and filters, for example. I know that Google are reluctant to provide diagnostic info on penalties as they don't want to aid spammers' attempts to get around them, but why can't they use discretion about providing information? For example, we have already filed manual re-inclusion requests and received responses that no manual action has been taken on our site. Given that a human has taken a look at the site, why, in such a situation, can that Google employee not make a decision on whether to 'press a button' in order to include diagnostic info in the standard reply sent? e.g. that reply could state that a site has violated the guidelines because of hidden text, or lack of original content. (The point being that this needn't be sent out in all cases.)
Anyway.... that's the 'politics' again.... back to the specifics.
Of course we want to make whatever improvements to the site we can in order to improve ranking (and get the filter removed). So...
#1 Yes, we'll look into altering our URLs.... lower case, no spaces, shorter etc. This should be a straightforward change; most of the pages work fine regardless of the case of the URL anyway (a sketch of the sort of rewrite rule we have in mind follows after point #2).
#2 I'll ensure the targeted phrases are used in the body of the text. Again, shouldn't be a big change. But, as I noted before, this problem seems to relate to the property detail pages and currently these are NOINDEX for googlebot anyway. So any discussion of the problems on these pages is somewhat by-the-by for the present as we're not trying to get them indexed. We could of course change our minds about that, but for the time being I'd rather concentrate on locations.
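On #1, since we're on ASP.NET/IIS, the sort of thing we have in mind is a site-wide redirect rule along these lines (this assumes the IIS URL Rewrite module is available; a sketch only, untested):

```xml
<!-- web.config fragment: 301-redirect any URL containing capitals
     to its all-lowercase form -->
<rewrite>
  <rules>
    <rule name="Lowercase URLs" stopProcessing="true">
      <match url="[A-Z]" ignoreCase="false" />
      <action type="Redirect" url="{ToLower:{URL}}" redirectType="Permanent" />
    </rule>
  </rules>
</rewrite>
```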
But, before I go on, I think I should say that you seem to have looked at mainly those properties we have with shorter descriptions. I can assure you that they aren't all like that! It really depends on the individual owner (how much description they provide), or manager (for collections of properties from one company). Some provide more extensive information than others. In general terms I could say that we have 17,500 properties..... at least half of that number have very full descriptions.... another 25% are probably 'medium'... and then the rest are shorter. Some examples of fuller descriptions (and multiple pictures) might be:
Or:
...
But, it's the location pages we want to see indexed (whether they be the search pages or the guide pages). I take your point about 'search' pages, but as I said in my last post I believe Google's main 'target' in their mention of search results in their guidelines is those sorts of pages which contain scraped Google search results (or very similar). Vertical search of our own data, ordered and sorted as we do it, which cannot otherwise be found via a general Google search, is really a different case. Google 'may' be against this sort of page.... but I just don't believe that is borne out by the evidence you see in their own SERPs from other holiday rental companies. The majority of landing pages listed are search result pages (and I believe that is what users are looking for when they google for 'holiday lets in....'), so if everyone else can get these pages indexed I see no reason why we can't. I believe our penalty isn't down to that factor, so if we can fix whatever other issues hinder us then we should be OK. As my posts on Google's own forum show, we have made various fixes which may relate to the cause.... and it may just be a matter of time now for Google to remove a filter. Hard to say.
But.... that, again, is my real question here. I value any input on the actual SEO and structure of the site of course, but I am also looking for any knowledge based on experience of Google's application of penalties. We clearly don't have a manual penalty in place, but an algorithmic one.... but the question is, will that be lifted as soon as pages are fixed, or can it be for a set time? I really think, from what I'm seeing, that the latter is the case. In other words I could now create the perfect page and ask Google to index it, but currently it will still be down on page 40+ in the SERPs because a filter is hanging over the site until however many days have expired.
We've made a number of careful observations of the effects of the filter and it does appear that the major keywords on the page are devalued. What that means is that a search for 'holiday lets in TOWNA' won't show that page until about p40+ in the SERPs.... but if our TOWNB page actually returned some results for TOWNA because they are nearby, then that TOWNB page might actually rank much higher in the SERPs for a search on TOWNA. This shows that Google is picking up the words in the body text fine, but devaluing the words it highlights as keywords in the title and description. This seems to be getting applied throughout the site, regardless of the page and its content (ie. pages with lots of unique content fare no better than thinner pages at present.... because the keyword devaluing still affects them).
So it does seem that there is a time-span to this filter and all we can do in the meantime is try to ensure that the site is in the best state so that when that time expires no more problems are found with it.
Thanks again!
...
P.S. Thanks for the observations about oddities you noticed in searching.... I'll look into that. It isn't something I've noticed before and if it occurs it is clearly a bug. Probably some trivial cause.... we'll sort it out!
-
Hi Ian.
I understand your position on Google, the Panda update and their quality guidelines. There is so much I can say in response but the bottom line: Google is a private for-profit company. They offer to send organic search traffic to your site at no charge but require you meet certain guidelines they set forth. If you do not wish to participate, you can easily opt-out via the robots.txt file and other means. The reality is, businesses depend on Google so you and I along with everyone else need to understand their rules and work with them to maximize the exposure of our sites.
Regarding item #1, the URLs: it is normally not a large ranking factor, but it is a ranking factor, and in a competitive industry I strongly advise you to do everything possible to improve your rankings. A change in your case might make a larger difference than for most sites as there are so many non-standard corrections to be made. Additionally, URL appearance can affect click-through rates.
Regarding item #2, including a page's target keywords in content: I checked multiple pages of your site and noticed this issue. The David Carr link I mentioned above was one example, but all the similar pages I checked had this issue. The only inclusion of the phrase was in a standard "For a review of the David Carr.....click here". This phrase was not in the original paragraph, it was separated from the rest of the content, and it seemed very "canned".
Regarding item #3, here is a link from the #1 ranked site to one of their pages which is equivalent to the David Carr link from your site: http://www.holidaycottages.co.uk/devon/north-devon/gracious-ford-cottage. This page represents the type of page Google believes searchers would like to see. Please specifically notice the page contains what appears to be rich, custom and unique content specifically related to this listing. Also note the clean URL and multiple pictures. This site ranks #1 for the term "holiday lets in Devon".
This particular page targets the term "gracious ford cottage" and ranks #1 for that term. Of course, there is zero competition for that exact phrase, but it also ranks #10 for the term "ford cottage". There are improvements which can be made to the page. For example, this page also fails to use the term "ford cottage" within its content. If it did, it would likely rank even higher.
The final point you shared is about showing your search result page in Google. Generally speaking, Google prefers not to show "search results within search results". I also noted a few oddities during my brief visit to your site. The same page changes upon refresh. Using the Beer, Devon example, I loaded the page on two separate PCs. I tried to find the same David Carr result from PC #1 on PC #2 and was unable to do so. There were about 50 results and I scanned them multiple times and it was not there. The order and selection of cottages seems to change each time. I can't say this would affect rankings, other than that it did not create a user-friendly experience for me.
I recognize your viewpoint. You feel you are presenting your site in the best manner possible for your users, and you do not wish to change your presentation to satisfy a search engine. Well, you don't have to...unless you desire to be ranked higher by that search engine. You can leave your site as-is and pay Google for AdWords and generate your traffic in that manner. Otherwise, your site presentation will need to radically change if you wish to find its pages listed higher in search results.
Good luck.
-
Thanks Ryan. I really appreciate you taking the time to have a good look at the site and to reply in such detail – and I value the input and advice you give. Your opening sentence, that "from an SEO perspective, your site requires a tremendous amount of work", I can imagine will line up a number of emails in my inbox from prospective SEO companies....! At least someone will appreciate what we've done!!
Joking aside, do you mind if I address your points in reverse order? I think that will take us from the general to the specific.
Regarding the point that Google has specifically aimed to penalize this type of site, especially with the Panda changes, I quite recognise and understand that fact. Our site has been around a while and was indexed well in Google for quite some months. They had over 40,000 pages indexed as it happens, and many would rank in the first three pages of results (not fantastic, but OK, and it delivered some good traffic). Clearly, come early July, something changed, some new algo was introduced and some scan of our site found it wanting (or perhaps we altered something and broke things, but I don't think so...). It's been mentioned by others that this is all down to the need for 'rich content' and that there perhaps hasn't been enough of that on our site, perhaps too much duplication, and hence the penalty/filter.
Well, that may be the case, and I quite understand Google's thinking..... but I have to say, I fundamentally disagree with it! Not altogether, but certainly in terms of how it applies to our particular market sector. We all know what spammy sites are, and how you can search for something and find sites which simply list a bunch of similar search results, all bundled up with lots of advertising etc. Obviously Google's mission is to get many of these sites out of its index and promote the sites which really deliver useful content. All well and good. So they've come up with various algorithms to that end. The trouble is that internet search is very 'broad' in its scope, and what applies to one search term doesn't necessarily apply to others. I really think it's a monumental task to devise algorithms which work for all cases. For example, if I did a search for 'types of cancer' I'd probably be hoping to find sites with article pages which discuss the types in some depth – so I'd hope Google would deliver such sites in SERPs. However, in our market people search for things like 'holiday lets in Devon' and what they are actually wanting to find (I believe) are sites that present to them a wide range of holiday lets in that location which they can search, filter (by price, features etc) and read about in order to find a property which is available, at the right price at the right time. The thing is, the sites which deliver that necessarily have landing pages with lists of properties, with shortened descriptions, which Google's 'Panda' could consider to be 'thin' or of 'low quality'. Yet I'm convinced it is exactly what users are looking for....
...as it happens, the reason we made our site in the first place was to fill a gap in the market. Fed up of jumping from one holiday rental company's site to another's, endlessly searching on the same locality on many different sites, we wanted one site which presented all of those selections in one place, in which we could search and compare. When we built our site there actually weren't many sites like that, and there still aren't.... and yet it fills the needs of many users. The trouble is, such 'vertical search' sites tend to be targeted these days by Google, even though Google's own search is 'horizontal'. It simply doesn't provide the specific tools you need to search that sort of data – such sites do. It's common knowledge what happened to foundem.co.uk and you can read their view of things by visiting their site, but again, they offer search and comparison (in various fields) and could be said to be 'thin' on content..... but in reality they aren't. Their value is in bringing disparate content together alongside each other within one page. Which is what we do with UK holiday lets.
Is a site like ours thin? Well.... if it is, then what of Google's SERPs? Google's search results are entirely crawled from other sites, all the text is pulled in from outside, and they offer nothing much in addition other than their ordering and presentation of those results. AND THAT IS THE POINT, because as Google themselves say, that is the unique value they provide to the user: the indexing, sorting and presentation of that data. And that sort of value is exactly what a site like ours provides. It isn't a question of having lots of our own, written-in-house, blocks of original text. The real value lies in the range of properties we have on offer and the search, sorting and comparison (and reviews) which we offer. In order to satisfy Panda we can (as we have been) work on expanding our own textual content.... but really that isn't what our users primarily want. Yes, they do value area guides (which we have), but their first, primary reason for coming to our site is to search for holiday lets. We aim to provide as extensive information about each one as we can, but given that each property links to the original owner's or manager's own site even that isn't 'essential'..... except for Panda.
So what we face is the need to work on our site in order to satisfy a search engine.... which seems wrong, because Google themselves say to design sites for users, not search engines. And yet our user-centric site fails the Panda test. Some may ‘dispute’ that, but I believe that to be the case. I think the thinking behind Panda is wrong when applied to certain market verticals.... including our own.
But, I realise perfectly well that this is all simply an expression of my opinion and won’t get the site indexed on Google. Google have their rules and to get indexed you need to follow those rules. I know that, and I’m happy to work to oblige (whether I agree with the principles behind them or not). But.... if it IS that Panda considers our site ‘too thin’ I can’t help but wonder what is really going on... because if you search for ‘holiday lets in Devon’ (for example) and actually visit the top 50 sites listed by Google you will find numerous sites that deliver property search results as we do, with relatively little content. You will find many others which are poorly presented affiliate style sites. And these are all there, not demoted. Some are established sites, some have many backlinks etc, so some have reasons for being there... but even so, there are many examples which are doing nothing obviously ‘better’ than our site, and yet which retain their ranking.....
Anyway, that’s the ‘opinion’ out of the way.... excuse my going on, but I think it needs saying. Now to the other points raised... Regarding point 3...
3. What you say about the property details pages is in many cases true. There are of course repeated sections on each page (booking info, availability, location etc)... but then that is logical. However, I don't think I need to discuss the merits of these pages and Panda, because we have in fact marked all the property detail pages as NOINDEX for googlebot, and Google has by and large removed the majority from their index. We used to have them all indexed, and we used to get long-tail traffic from them, but really it is the location search pages (HolidayCottagesLets/England/Devon/Beer) which we want indexed. Why? Because that is what users search for and want to find. They don't especially want our 17,000 detail pages listed in Google, they want the location pages from which they can use our own tailored searching and sorting to compare the properties and then examine them in detail on our site.
So, the real question is what troubles Google about those pages.... eg. HolidayCottagesLets/England/Devon/Beer. I know these are search results, but they aren’t ‘horizontal’ search results (like Google SERPS scraped and shown on our page) but vertical search of our own data. They are the pages users want to find via Google.... as demonstrated by the fact that the vast majority of rival sites which google returns in SERPS have similar landing pages (locality search). Is the content of these pages too thin? Well.... I don’t really think they are, certainly not in comparison with other sites. In addition to the 10 properties you get on the landing page there is also location descriptive text, which is our own text.
We also have guide pages for localities (Guides/England/Devon/Beer) which have even more information. Yes, there is repetition, but again there is lots which is useful here, including the location specific links to other sites which makes researching a particular place very easy.
It is these pages we're trying to get indexed again, and we have therefore made the property detail pages NOINDEX in order to avoid duplication between the two (where similar portions of text are found on both). Bing ranks these pages highly and I'd hope that they have sufficient content for Google too. At the moment the site is still filtered.... but that's my question. Is that because there is a TIME penalty on our site which we simply have to 'sit out', or will it automatically fix when we fix any given page (ie. get enough original text on a page and it may then start ranking)? I can't work out whether Google penalise individual page URLs or the whole domain.... but from what I can see it 'feels' like the whole domain is suffering, even if not every page is at fault.
2. I take your point here, but on that particular page the exact title is also used as the h1 (the property name) at the top of the page, and I'm fairly sure it's repeated two or three times in the body. The site specifically puts the property name (and location) in the page TITLE tag, the meta description, the h1 tag and the body... it SHOULD all match up. But I'll check.
1. Thanks, your comments on the URLs are helpful. I've seen that mentioned before (not too long, all lowercase), but I didn't know to what extent it 'mattered' beyond being 'expected' by users. We'll look into changing things, but I'd be amazed if it were a great factor behind the filtering/penalty. Perhaps it would hinder rankings a bit, but surely not to that extent? Anyway... thanks for the tip.
...
Sorry for the long reply... and for the 'opinion' at the top, but I just think it needed expressing. The idea that a page's worth is based entirely on the amount of original text on it just doesn't cut it with me. What if our site were a photo gallery? Some things just need 'different' rules... the worth of our site is in the comparison it provides. I say that not because I work on it; I work on it because I saw the need for a site like it. I saw the worth of it, made it, had it ranked on Google... then they decided 'no, we don't want that'. The folk who really lose out are the users, because I think Google's wrong: this is exactly the type of site they are searching for.
Thanks again! Any other help appreciated!
-
Hi Ian.
From an SEO perspective, your site requires a tremendous amount of work. I realize your site has a nice appearance and you have put a lot of effort into it. I can't speak to where your site ranked previously, but based on your history it's a good guess your site was hit by Panda-related changes.
Some issues I noticed:
1. Your URLs are very long, contain extra information which makes them seem unfriendly, use capital letters and also contain spaces. Example: "http://www.go2-holidays.com/HolidayCottageLodgeOrLet/Devon/Beer/HMA-732014/Linked cottage attached to stunning David Carr property". The %20 sequences in the encoded URL represent the spaces.
Use dashes in your URLs to represent spaces, replace any capital letters with lowercase, and consider friendlier URLs such as http://www.go2-holidays.com/devon/beer/linked-cottage-attached-to-stunning-david-carr-property
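A rough sketch of the kind of slug normalisation I mean, in Python just to illustrate (your platform will have its own equivalent):

```python
# Illustrative slug function: lowercase, collapse anything that isn't a
# letter or digit into a single dash, and trim stray dashes at the ends.
import re

def slugify(text: str) -> str:
    text = text.lower().strip()
    text = re.sub(r"[^a-z0-9]+", "-", text)
    return text.strip("-")

print(slugify("Linked cottage attached to stunning David Carr property"))
# -> linked-cottage-attached-to-stunning-david-carr-property
```

If you do restructure the URLs, remember to 301 redirect each old URL to its new equivalent so the pages keep whatever link equity they have.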
2. I examined several listings. Your page title and header text are not used in your content. Each web page should focus on one topic, and your title tag and header should represent that topic. It is very important for your content to support the topic as well. Using the "Linked cottage attached to stunning David Carr property" example, you actually need to write those words on your page, or a very close variation. If you are not willing to do that, consider changing the title and header.
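One way to audit this across many listings is to measure how many of the title's words actually appear in the body copy. Here's a rough sketch using the third-party requests and BeautifulSoup libraries (the URL is hypothetical):

```python
# Sketch: what fraction of the <title> words also appear in the visible body
# text? Values well below 1.0 suggest the title promises a topic the copy
# never delivers. Requires: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

def title_support(url: str) -> float:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title_words = set(soup.title.get_text().lower().split()) if soup.title else set()
    body_words = set(soup.body.get_text().lower().split()) if soup.body else set()
    return len(title_words & body_words) / len(title_words) if title_words else 0.0

print(title_support("http://www.example.com/some-listing"))  # hypothetical URL
```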
3. Your pages have very little unique content to index. Step back and look at your pages. Again, using the same page mentioned above: the header and footer are naturally the same throughout your site. Your "Bookings and Enquiries" section is also the same on each page. The "Offers and Discounts" section is repeated throughout the site. The "Availability" section is mostly the same on each page. The "Location" block contains a large amount of content and is the same for all the pages related to each location (i.e. Beer, Devon).
"Location: On the edge of Beer Village within 10 minutes walk of the beach. Accommodation: Entrance hall. Spacious sitting/dining room, period leather furniture, formal dining table and chairs featuring full length Georgian style glazed doors to pa.."
Above is the only unique content on the page. More than 90% of the page's content is duplicate. Also, this unique content is cut off; the one unique section ends in "..." This is consistent throughout your site. That portion is the most important part of the page, and I don't understand why it would be cut off.
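If you want to put a number on how much two listing pages share, here is a quick sketch using Python's standard library (the short strings below are placeholders standing in for the full extracted visible text of two real pages):

```python
# Sketch: ratio of shared text between two pages' extracted body text.
# 0.0 = nothing in common, 1.0 = identical. Two listings that differ only in
# a short truncated description will score very close to 1.0.
import difflib

def shared_ratio(page_a_text: str, page_b_text: str) -> float:
    return difflib.SequenceMatcher(None, page_a_text, page_b_text).ratio()

# Placeholder strings; in practice you'd pass the full extracted page text.
print(shared_ratio("shared boilerplate ... unique A", "shared boilerplate ... unique B"))
```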
There are other issues as well. I viewed your site from two separate PCs. Looking at what appeared to be the identical URL for the David Carr property, the description and image were different. I checked the URLs and noticed both pages are titled "Linked cottage attached to stunning David Carr property", but one of them had the id HMA-732013 while the other id ended in a "4". Both pages have the exact same title, header, etc.
Google's algorithms have been specifically designed to penalize this type of website. Users want to see unique web pages with rich content. By Google's standards, this site is low quality. The algorithmic penalties are warranted.
-
P.S. I might add that Bing ranks us just fine! Whatever triggers a problem on Google has yet to be mirrored on Bing/Yahoo (thankfully).
E.g., search Bing for 'Holiday Lets in Beer'... we're currently no. 1.