Redirects and sitemap aren't showing
-
We had a malware hack and spent three days trying to get Bluehost to fix things. Since they made their changes, two things are happening:
1. Our XML sitemap cannot be created at https://www.caffeinemarketing.co.uk/sitmap.xml — we have tried external tools as well
2. We had 301 redirects from the http (www and non-www) versions and the https:// (non-www) version throughout the whole website to https://www.caffeinemarketing.co.uk/ and its subsequent pages
Whilst the redirects seem to be happening, when you go into tools such as https://httpstatus.io, every version of every page returns only a 200 code, whereas before they were showing the 301 redirects
Have Bluehost messed things up? Hope you can help
thanks
-
I agree with what effectdigital said. It looks like everything is in place, and your non-www and http versions of the website are redirecting to the https-www version of the site.
-
That attachment shows that non-HTTPS and non-WWW URLs are being 301 redirected to the HTTPS-WWW version(s). That's what you want, right? From your screenshot it seems like it is working how you want
Just so you know, when you put one architecture into Screaming Frog (e.g. HTTP with no WWW), it doesn't limit the crawl to that specific architecture. If the crawler is redirected from non-WWW, non-HTTPS URLs to HTTPS with WWW, then it will carry on crawling THAT version of the site
If you wanted to crawl all of the old HTTP, non-WWW URLs, you would need to list them for Screaming Frog in list mode and alter the crawler's settings to 'contain' it to just the list of URLs you entered. I'm pretty sure you would then see that most of the HTTP, non-WWW URLs are properly redirecting as they should be
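If you'd rather script the check than use list mode, the idea can be sketched in a few lines of Python: generate every scheme/host combination of a page, then test each one's status code individually. This is just an illustrative sketch (`url_variants` is a made-up helper, not a Screaming Frog feature):

```python
# Sketch: build every protocol/host variant of a page so each one can be
# checked for its redirect status individually.
from urllib.parse import urlsplit, urlunsplit

def url_variants(url):
    """Return the http/https x www/non-www combinations of a URL."""
    parts = urlsplit(url)
    host = parts.netloc
    if host.startswith("www."):
        host = host[len("www."):]
    return [
        urlunsplit((scheme, prefix + host, parts.path, parts.query, parts.fragment))
        for scheme in ("http", "https")
        for prefix in ("", "www.")
    ]

for v in url_variants("https://www.caffeinemarketing.co.uk/"):
    print(v)
```

Each variant could then be pasted into list mode, or checked directly (e.g. with `requests.head(v, allow_redirects=False)`) to confirm that three of the four return a 301 and only the canonical HTTPS-WWW one returns 200.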
As for the XML issue, it's very common, especially for people using Yoast. I think Yoast is really good, by the way, but for some reason, on some hosting environments, the XML sitemap starts rendering blank. Most of the time hosting companies say they can't fix it and that it's Yoast's fault, but I don't really believe that. If a file (e.g. sitemap.xml) cannot be created, it's more likely that they went in via FTP and changed some file read/write permissions; with everything more locked down, the XML can no longer be created. Since you were hacked by malware, they were likely over-zealous when locking your site back down, and that's causing problems for your XML feed(s)
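If locked-down permissions are the suspect, a quick check is to look at the permission bits on the directory the sitemap would be written into. A minimal sketch (the path is a placeholder — substitute your real document root and run it on the server):

```python
# Sketch: report whether a directory is writable by the current process.
# Pass the real document root (e.g. public_html) instead of ".".
import os
import stat

def describe_permissions(path):
    mode = os.stat(path).st_mode
    return {
        "octal": oct(stat.S_IMODE(mode)),               # e.g. '0o755'
        "writable_by_this_process": os.access(path, os.W_OK),
    }

print(describe_permissions("."))
```

If the plugin's PHP process cannot write to that directory, sitemap generation will silently fail; loosening directories back to 755 (and files to 644) is the usual fix after an over-aggressive lockdown.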
-
see attachment
-
Hi, are you able to interpret this for me, please? It looks like the non-www versions are showing as the https://www version with a 200 code. The home page looks like the only 301?
-
Hi Carrie,
For your 301 redirects at the root level, it sounds like the .htaccess file has changed on the server. Can you try validating those other http and non-www versions of the website through other tools like Screaming Frog? If you're still getting 200 response codes, I would advise raising the issue with Bluehost, as this is something they can fix.
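For reference, a typical .htaccess rule set that forces both HTTPS and the www host in a single 301 hop looks roughly like this (an illustrative sketch, not necessarily what was on your server before the hack; it assumes mod_rewrite is enabled):

```apache
RewriteEngine On
# Redirect any request that is not HTTPS, or not on the www host,
# to the canonical https://www version in one 301 hop.
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ https://www.caffeinemarketing.co.uk/$1 [R=301,L]
```

If Bluehost restored or regenerated the .htaccess during the cleanup, rules like these may simply be missing, which would explain the 200s.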
As for the XML sitemap, do you mean that you're unable to upload a file to that location? Have you tried sFTP?