Reclaiming ranking positions in Google
-
We have a website we are working on that was ranking well in Google, but since a hosting upgrade it has dropped out of the rankings completely.
When the hosting upgrade was made, the developer added an incorrect robots.txt file that blocked the site from being crawled, which cost us our rankings. We have since fixed that issue, so the robots.txt is now OK. However, the rankings have yet to recover, and we are unsure why, as it has been a while now.
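One quick way to double-check that the repaired robots.txt really does permit crawling is Python's built-in robots parser. A minimal sketch; the robots.txt contents below are illustrative guesses at what the broken and fixed files might look like, not the site's actual files:

```python
from urllib.robotparser import RobotFileParser

def is_crawlable(robots_txt: str, url: str, agent: str = "Googlebot") -> bool:
    """Return True if the given robots.txt text allows `agent` to fetch `url`."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(agent, url)

# A blanket disallow, the kind of rule that typically causes a full de-ranking:
broken = "User-agent: *\nDisallow: /"
# An empty Disallow means "allow everything":
fixed = "User-agent: *\nDisallow:"

print(is_crawlable(broken, "https://www.brightonpanelworks.com.au/"))  # False
print(is_crawlable(fixed, "https://www.brightonpanelworks.com.au/"))   # True
```

Pointing the same check at the live file (via `RobotFileParser.set_url` and `.read()`) would confirm what Googlebot currently sees.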
The site is https://www.brightonpanelworks.com.au. We have also attempted to add a sitemap to help the site be crawled more thoroughly and to regain rankings; however, sitemap generators appear to be having problems creating a sitemap for this site and we are not sure why. Nor do we know whether this relates to why Google has not picked up the pages and the ranking results have not been restored.
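If third-party sitemap generators keep failing on the site, a hand-rolled sitemap built from a known list of URLs is a workable stopgap while the underlying issue is investigated. A minimal sketch; the page list here is just an illustration, not the site's full URL set:

```python
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Build a minimal XML sitemap from an iterable of absolute URLs."""
    entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in urls)
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )

pages = [
    "https://www.brightonpanelworks.com.au/",
    "https://www.brightonpanelworks.com.au/services/",
]
print(build_sitemap(pages))
```

The resulting XML can be saved as sitemap.xml at the site root and submitted in Webmaster Tools.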
If you have any ideas on how we can reclaim the strong positions we held previously, that would be much appreciated. We believe we may be missing something that is preventing the pages from being picked up and ranked by Google.
-
Not a problem, happy to help
-
Thanks so much Mike! Got that export and appreciate you helping us here!
-
Yup, all looks good now - the page has index, follow
I'm going to send you a private message here on Moz with a Screaming Frog crawl export, which I carried out just to check there were no instances of 'noindex' left on any pages... All looks fine from the noindex standpoint.
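For anyone wanting to repeat that noindex check without Screaming Frog, sniffing the robots meta tag is easy to script. A sketch under the assumption that you already have the page HTML in hand (fetching is left out for brevity; the function names are mine):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collect the content of any <meta name="robots"> tags in a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if d.get("name", "").lower() == "robots":
                self.directives.append(d.get("content", "").lower())

def has_noindex(html: str) -> bool:
    """True if any robots meta directive on the page contains 'noindex'."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in c for c in parser.directives)

print(has_noindex("<meta name='robots' content='noindex,follow' />"))  # True
print(has_noindex("<meta name='robots' content='index, follow' />"))   # False
```

Running this over each crawled page's source gives the same "any noindex left?" answer as the crawl export.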
-
Sorry for the late reply, checking now for you
-
Hi Mike,
Thanks for your help!!
Could you check now to ensure it is OK? A change was made and I believe the source code no longer contains the noindex line.
However, when I try to create a sitemap through a sitemap generator to add to Webmaster Tools, I am still unable to do so, which makes me think the problem may not have been resolved.
Your further assistance here would be most appreciated.
-
Hi Gavo.
I think I've spotted your issue!
Looking at https://www.brightonpanelworks.com.au/robots.txt I can see that you're all good now, HOWEVER...
view-source:https://www.brightonpanelworks.com.au/ Check your source code and you'll notice an inline noindex meta tag!
Also, checking another page: view-source:https://www.brightonpanelworks.com.au/services/ I get the same meta tag.
This makes me think it's sitewide... So, as you're running WordPress:
- In wp-admin, go to Settings > Reading, then untick the box that discourages search engines.
Once you've done that, the following optional steps won't hurt and may speed things up:
- Submit a new sitemap
- Use the fetch & render, as Kevin recommends
- Personally, and this is entirely conjecture on my part, I'd also run Google's PageSpeed and mobile-friendly testing tools, as that 'seems' to help (I've not bothered verifying by running tests and checking the access logs for bot activity etc.; since it only takes a few seconds to do, it's not worth the time to verify. Maybe I'll check out of curiosity at some point though!)
** As there have been a couple of errors with indexing, once you've made this change I'd recommend running a full site crawl (Moz's tools or Screaming Frog are both fine) and checking for any other noindex pages! **
EDIT: In case you don't have Screaming Frog, let me know when you've updated the WordPress setting and I'll run a crawl for you and send you a list of any pages still showing noindex tags, if any exist.
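If you'd rather script that crawl yourself, a bare-bones same-domain crawler that flags noindex pages might look like this. A sketch only: no politeness delay, minimal error handling, and the class/function names are mine, not from any crawling library:

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class PageParser(HTMLParser):
    """Collect same-page links and robots meta directives in one pass."""
    def __init__(self):
        super().__init__()
        self.links = []
        self.robots = []

    def handle_starttag(self, tag, attrs):
        d = dict(attrs)
        if tag == "a" and d.get("href"):
            self.links.append(d["href"])
        elif tag == "meta" and d.get("name", "").lower() == "robots":
            self.robots.append(d.get("content", "").lower())

def crawl_for_noindex(start_url, limit=50):
    """Breadth-first crawl of one domain; return URLs whose robots meta contains noindex."""
    domain = urlparse(start_url).netloc
    seen, queue, flagged = {start_url}, deque([start_url]), []
    while queue and len(seen) <= limit:
        url = queue.popleft()
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", "replace")
        except OSError:
            continue  # skip pages that fail to fetch
        parser = PageParser()
        parser.feed(html)
        if any("noindex" in c for c in parser.robots):
            flagged.append(url)
        for href in parser.links:
            nxt = urljoin(url, href).split("#")[0]
            if urlparse(nxt).netloc == domain and nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return flagged
```

Calling `crawl_for_noindex("https://www.brightonpanelworks.com.au/")` after unticking the WordPress setting should return an empty list if the fix took sitewide.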
-
Have you submitted your pages for fetching through Webmaster Tools? Check when Google last crawled you, make some content upgrades and re-submit. That always gets us indexed within 48 hours.
I use the Moz Page Optimization tool as a guide to edit our product pages, that helps immensely.
KJr
Related Questions
-
Ranking drop
Hello there, Based on Moz's rank tracker, our keyword rankings have been dropping. Does anyone know what might be the cause? We have been building quality "white hat" links which are very relevant to our niche. Thanks, Robert
Intermediate & Advanced SEO | roberthseo
-
Checking Rankings Again & Again Can Drop Rankings
Is it possible that checking my Google rankings again and again can drop them? For example, could checking where my keywords rank every hour drop the rankings, because it indirectly affects the CTR? It seems no one has faced such a weird thing before.
Intermediate & Advanced SEO | welcomecure
-
Ranking on google but not Bing?
Any reason why I could be ranking in Google but not in Bing?
Intermediate & Advanced SEO | edward-may
-
Google images
Hi, I am working on a website with a large number (millions) of images. For the last five months I have been trying to get Google Images to crawl and index these images (example page: http://bit.ly/1ePQvyd). I believe I have followed best practice in the design of the page, naming of images, etc. While crawling and indexing of the pages is going reasonably well with the standard crawler, the image bot has only crawled about half a million images and indexed only about 40,000. Can anyone suggest what I could do to increase this number 100-fold? Richard
Intermediate & Advanced SEO | RichardTay
-
Homepage not ranking in Google AU, but ranking in Google UK?
Hey everyone, My homepage has not been ranking for its primary keyword in Google Australia for many months now. Yesterday, when I was using a UK proxy and searching via Google UK, I found my homepage/primary keyword ranked on page 8 in the UK. In Australia my website ranks on page 6, but for other pages on my website (and it always changes from page to page). Previously my page was popping up at the bottom of page 1 and on page 2. I've been trying many things and waiting weeks to see if they had any impact for over 4 months, but I'm pretty lost for ideas now, especially after what I saw yesterday in Google UK. I'd be very grateful if someone has had the same experience or suggestions on what I should try. I did a small audit on my page, and because the site is focused on one product and features the primary keyword, I took the following steps to try and fix the issue: I noticed the developer had added H1 tags in many places on the homepage, so I removed them all to make sure I wasn't getting an over-optimization penalty. Cleaned up some of my links because I was not sure if this was the issue (I've never had a warning within Google Webmaster Tools). Changed the title tags/H tags on secondary pages to feature the primary keyword less. Made some pages 'noindex' to try to reduce the emphasis on the secondary pages. Resubmitted my XML sitemaps to Google. Just recently claimed a local listing in Google (still need to verify) and fixed up citations of my address/phone numbers etc. (however, it's not a local business; it sells Australia-wide). Added some new backlinks from AU sites (only a handful though). The only other option I can think of is to replace the name of the product on secondary pages with a different abbreviation, to make sure the keyword isn't featured there as much.
Some other notes on the site: When I do a 'site:url' search, my homepage comes up at the top. The site sometimes ranked for a secondary keyword on the front page in specific locations in Australia (but it goes to a localised city page); I've noindexed these as a test to see if something with localisation is messing it around. I do have links from AU, but I also have links from .com and elsewhere. Any tips or advice would be fantastic. Thanks
Intermediate & Advanced SEO | AdaptDigital
-
Homepage bombed from rankings
I'm working on a site that has historically had issues ranking the homepage. We cleaned up some on-page issues and then it went into a high-and-low pattern: page 4, then page 12, etc. (it was static around page 9 before), settling at page 6. The link profile was not good and there was a high number of links that should have been no-follow, as they clearly looked paid for; we addressed this along with some other poor links. This effectively dropped the ranking to page 23, but that was not unexpected considering the very big drop in followed links. Meanwhile, we have embarked on a fresh, steady link-building strategy with clean links and varied anchor text coming from domains of varying DA, smattered with a few no-follow links, strongly focusing on being as natural as possible. At the Penguin update the homepage totally disappeared. Frustratingly, just after the update (the same day) we removed a 301ed old domain from the profile. This was the old company URL, which we discovered had a lot of spam linking associated with it. An oversight: there were other 301 domains removed some time ago which were totally unrelated to the main site, and we were told all other domains were simply bought and redirected to stop hijacking; all but this one were. Considering the work we have done, would it be a fair assumption that this 301ed domain could be the underlying factor? So far organic traffic is steady, in fact a tad up. What would you guys do?
Intermediate & Advanced SEO | MickEdwards
-
How many links would you need to rank up in page rank?
White hat **** Can 20 websites with a PageRank of 3 make your site rank higher?
Intermediate & Advanced SEO | spidersite
-
Google bot vs google mobile bot
Hi everyone 🙂 I seriously hope you can come up with an idea for a solution to the problem below, because I am kinda stuck 😕 Situation: A client of mine has a webshop on a hosted server. The shop is built in a closed CMS, meaning I have very limited options for changing the code: limited access to the page head, and within the CMS I can only use JavaScript and HTML. The only place I have access to a server-side language is in the root, where a Default.asp file redirects the visitor to the specific folder where the webshop is located. The webshop has 2 "languages"/store views: one for normal browsers and Googlebot, and one for mobile browsers and the Google mobile bot. In Default.asp (ASP Classic) I test for the user agent and redirect the user to either the main domain or the mobile sub-domain. All good, right? Unfortunately not. Now we arrive at the core of the problem. Since the mobile shop was added at a later date, Google already had most of the pages from the shop in its index, and apparently uses them as entrance pages to crawl the site with the mobile bot. Hence it never sees Default.asp (or outright ignores it), and this causes, as you might have guessed, a huge pile of duplicate content. Normally you would just place some user-agent detection in the page head and either throw Google a 301 or a rel-canonical. But since I only have access to JavaScript and HTML in the page head, this cannot be done. I'm kinda running out of options quickly, so if anyone has an idea as to how the BEEP! I get Google to index the right domains for the right devices, please feel free to comment. 🙂 Any and all ideas are more than welcome.
Intermediate & Advanced SEO | ReneReinholdt