Redirects and sitemap aren't showing
-
We had a malware hack and spent three days trying to get Bluehost to fix things. Since they made their changes, two things have been happening:
1. Our XML sitemap cannot be created at https://www.caffeinemarketing.co.uk/sitmap.xml; we have tried external tools too.
2. We had 301 redirects from the http (www and non-www versions) and the https (non-www version) throughout the whole website to https://www.caffeinemarketing.co.uk/ and its subsequent pages.
Whilst the redirects seem to be happening, when you go into tools such as https://httpstatus.io, every version of every page now returns only a 200 code, whereas before they were showing the 301 redirects.
Have Bluehost messed things up? Hope you can help.
Thanks
-
I agree with what effectdigital said. It looks like everything is in place, and the non-www and http versions of the website are redirecting to the https-www version of the site.
-
That attachment shows that non-HTTPS and non-WWW URLs are being 301 redirected to the HTTPS-WWW version(s). That's what you want, right? From your screenshot it seems to be working exactly as you intended.
Just so you know, when you put one architecture into Screaming Frog (e.g. HTTP with no WWW), it doesn't limit the crawl to that specific architecture. If the crawler is redirected from non-WWW, non-HTTPS URLs to HTTPS with WWW, it will carry on crawling THAT version of the site.
If you wanted to crawl all of the old HTTP non-WWW URLs, you would need to give Screaming Frog the full list of them in list mode and alter the crawler's settings to 'contain' it to just the URLs you entered. I'm pretty sure you would then see that most of the HTTP non-WWW URLs are redirecting properly, just as they should be.
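If you only want to spot-check a few URLs rather than run a full crawl, something like this works from any terminal with curl installed (a sketch; swap in your own URLs):

# Show the status code and redirect target for a single URL
curl -sI http://caffeinemarketing.co.uk/ | grep -iE '^(HTTP|Location)'

# Follow the whole redirect chain and report the final URL and status
curl -sIL -o /dev/null -w '%{url_effective} %{http_code}\n' http://caffeinemarketing.co.uk/

A healthy setup should show a 301 with a Location header pointing at the https-www URL on the first command, and the https-www URL with a 200 on the second.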
As for the XML thing, it's very common, especially for people using Yoast. I think Yoast is really good, by the way, but for some reason on some hosting environments the XML sitemap starts rendering blank. Most of the time hosting companies say they can't fix it and that it's Yoast's fault, but I don't really believe that. If a file (e.g. sitemap.xml) cannot be created, it's more likely they went in via FTP and changed some file read/write permissions; with everything more locked down, the XML can't be created any more. Since you were hacked by malware, they were probably over-zealous when locking your site back down, and that's causing problems for your XML feed(s).
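If that permissions theory is right, inspecting and correcting them over SSH would look roughly like this. This is only a sketch: the web root path (public_html here) and the correct ownership depend on your specific Bluehost account.

# Inspect current permissions on the sitemap file and the web root
ls -l ~/public_html/sitemap.xml
ls -ld ~/public_html

# Typical safe defaults: 644 for files, 755 for directories
chmod 644 ~/public_html/sitemap.xml
chmod 755 ~/public_html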
-
see attachment
-
Hi, are you able to interpret this for me, please? It looks like the non-www versions are showing as the https://www version with a 200. The home page looks like the only 301?
-
Hi Carrie,
For your 301 redirects at the root level, it sounds like the .htaccess file has been changed on the server. Can you try validating those other http and non-www versions of the website through other tools such as Screaming Frog? If you're still getting 200 response codes, I would advise raising the issue with Bluehost, as this is something they can fix.
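For reference, a typical .htaccess block that forces both the http and non-www versions over to https-www looks something like this. It's only a sketch, assuming Apache with mod_rewrite enabled; Bluehost's actual rules for your site may have been different before the cleanup.

RewriteEngine On

# Redirect anything that is not already the https-www version
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ https://www.caffeinemarketing.co.uk/$1 [R=301,L]

If a malware cleanup replaced or rewrote that file, rules like these could easily have been lost, which would explain every URL suddenly answering 200.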
As for the XML sitemap, do you mean that you're unable to upload a file to that location? Have you tried sFTP?
Related Questions
-
Why is Copyscape showing a content duplication error even after implementing a 301 redirect?
We maintain the corporate website of one of our prestigious clients, "FineTech Toolings" (http://www.finetechtoolings.in). I recently raised a question about two websites running in parallel on two different domains, i.e. one organisation having two different websites on two different domains. The domain has since changed from http://www.finetechtoolings.co.in to http://www.finetechtoolings.in via a 301 redirect, but I am still facing a content duplication issue according to Copyscape, so I have a small doubt: even though I have implemented the 301 redirect (http://www.finetechtoolings.co.in redirects to http://www.finetechtoolings.in), which is completely fine as per SEO rules, why does Copyscape still report that duplicate content exists on the former website?
Technical SEO | KDKini
-
An article we wrote was published in the Daily Business Review, and we'd like to post it on our site. What is the proper way?
Part 1
We wrote an article and submitted it to the Daily Business Review, and they published it on their website. We also want to post the article on our own website for our users, but we want to make sure we are doing this properly; we don't want to be penalized for duplicate content. Is the following the correct way to handle this scenario? We added a rel="canonical" to the blog post on our website, set to the Daily Business Review URL where the article was originally published. At the end of the blog post we wrote, "This article was originally posted on The Daily Business Review," and we link to the original post. Should we be setting the blog post on our website to "noindex" instead, or is the rel="canonical" right?
Part 2
Our company was mentioned in a number of articles. We DID NOT write those articles; we were only mentioned. We have also posted those same articles on our website, verbatim from the originals, because we want to show our users that we have been mentioned in highly credited articles. All of these articles are set to "noindex" on our site. Is that the correct thing to do, or should we be using a rel="canonical" pointing at the original article URL instead? Thanks in advance, Moz community, for your assistance! We tried to do the legwork of our own research, but couldn't find this exact scenario.
Technical SEO | peteboyd
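For anyone weighing the same options, the two approaches look like this in the page's head section (illustrative snippets only; the publisher URL is a placeholder, not the real Daily Business Review address):

<!-- Option 1: cross-domain canonical pointing at the original publisher -->
<link rel="canonical" href="https://www.dailybusinessreview.example/your-article" />

<!-- Option 2: keep the syndicated copy out of the index entirely -->
<meta name="robots" content="noindex, follow" />

-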
How to solve the meta description "A description for this result is not available because of this site's robots.txt"?
Hi, I have many marketing URLs that 301 redirect to actual pages on my company's site. My URL provider says the load from bot requests on those URLs is too high, so they put a robots.txt on the redirection server! Strange or not? Now I have this meta description on all my campaign URLs that 301 redirect: "A description for this result is not available because of this site's robots.txt." If you have the perfect solution, could you share it with me? Thank you.
Technical SEO | Vale7
-
Duplicate Page Content error but I can't see it
Hi all, we're getting a lot of Duplicate Page Content errors but I can't match them up. For example, this page: http://www.daytripfinder.co.uk/attractions/32-antique-cottage
It is reporting the on-page properties as follows:
Title: DayTripFinder - Things to do reviewed by you - 7,000 attractions
Meta Description: Read Reviews, Browse Opening Hours and Prices. View Photos, Maps. 7,000 UK Visitor Attractions.
But this isn't the page title or meta description. And it's showing five example pages (among many others) that share it; again, their page titles and descriptions are different:
http://www.daytripfinder.co.uk/attractions/mckinlay-theatre
http://www.daytripfinder.co.uk/attractions/bakers-dolphin
http://www.daytripfinder.co.uk/attractions/shipley-park-fishing
http://www.daytripfinder.co.uk/attractions/king-johns-lodge-and-gardens
http://www.daytripfinder.co.uk/attractions/city-hall
Any ideas? Not sure if I'm missing something here! Thanks!
Technical SEO | KateWaite85
-
I added microdata, so why doesn't Google show it in the SERPs?
The site is http://www.lightinthebox.com/. I added microdata to all the product pages a month ago, and I used Google's Rich Snippets Testing Tool, which shows me that everything is all right. For example: http://www.lightinthebox.com/ouku-horizon-3g-android-smart-phone-with-3-5-inch-capacitive-touchscreen-800mhz-wifi-gps_p225435.html But Google just doesn't show the rich snippets in the SERPs. Any ideas? Thanks!
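For context, product-page microdata of the kind the testing tool validates looks broadly like this (an illustrative schema.org sketch with placeholder values, not the actual LightInTheBox markup):

<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Ouku Horizon 3G Android Smart Phone</span>
  <div itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating">
    <span itemprop="ratingValue">4.5</span> stars, based on
    <span itemprop="reviewCount">87</span> reviews
  </div>
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    <span itemprop="price" content="129.99">$129.99</span>
    <meta itemprop="priceCurrency" content="USD" />
  </div>
</div>

Even with valid markup, Google treats rich snippets as discretionary: passing the testing tool confirms eligibility, not that the snippet will actually be shown.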
Technical SEO | Litb
-
Site 'filtered' by Google in early July.... and still filtered!
Hi,
Our site got demoted by Google all of a sudden back in early July. You can view the site here: http://alturl.com/4pfrj and you may read the discussions I posted in Google's forums here:
http://www.google.com/support/forum/p/Webmasters/thread?tid=6e8f9aab7e384d88&hl=en
http://www.google.com/support/forum/p/Webmasters/thread?tid=276dc6687317641b&hl=en
Those discussions chronicle what happened and what we've done since. I don't want to make this a long post by retyping it all here, hence the links. However, we've made various changes (as detailed), such as getting rid of duplicate content (use of noindex on various pages, etc.) and ensuring there is no hidden text (we made an unintentional blunder there through use of a third-party control which used CSS hidden text to store certain data).
We have also filed reconsideration requests with Google and been told that no manual penalty has been applied, so the problem is down to algorithmic filters being applied. My reason for posting here is simply to see if anyone can help us discover whether there is anything we have missed. I'd hope we've addressed the main issues and that eventually our Google ranking will recover (i.e. the filter is removed; it isn't that we 'rank' poorly, but that a filter is bumping us down to, for example, page 50), but after three months it sure is taking a while!
It appears that a 30-day penalty was originally applied, as our ranking recovered in early August. But a few days later it dived down again (so presumably Google analysed the site again, found a problem and applied another penalty/filter). I'd hope that might have been 30 or 60 days, but 60 days have now passed, so perhaps we have a 90-day penalty now. Or perhaps there is no time frame this time, simply the need to fix whatever is constantly triggering the filter (that said, I 'feel' like a time frame is there, especially given what happened after 30 days).
Of course, the other aspect that can always be worked on (and is oft-mentioned) is the need for more and more original content. We've done a lot to increase this and think our Guide pages are pretty useful now. I've looked at many competing sites that rank in Google and they really don't offer anything more than we do, so if that is the issue, it sure is puzzling that we're filtered and they aren't.
Anyway, I'm getting wordy now, so I'll pause. Would anyone like to have a quick look at the site and see what they can deduce? We have of course run it through SEOmoz's tools and made use of the suggestions; our target pages generally rate as an A for SEO in the reports. Thanks!
Technical SEO | Go2Holidays
-
Removing a site from Google's index
We have a site we'd like to have pulled from Google's index. Back in late June, we disallowed robot access to the site through the robots.txt file and added a robots meta tag with "noindex,nofollow" directives. The expectation was that Google would eventually crawl the site and remove it from the index in response to those tags. The problem is that Google hasn't come back to crawl the site since late May. Is there a way to speed up this process and communicate to Google that we want the entire site out of the index, or do we just have to wait until it's eventually crawled again?
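One detail worth flagging for anyone in the same spot: these two measures work against each other. If robots.txt blocks crawling, Googlebot can never re-fetch the pages to see the noindex tag. Roughly, the conflicting setup looks like this (illustrative):

# robots.txt — blocks all crawling, so the meta tag below is never re-read
User-agent: *
Disallow: /

<!-- robots meta tag on each page — only takes effect if the page can be crawled -->
<meta name="robots" content="noindex, nofollow" />

Allowing crawling while keeping the noindex tag in place, or using the URL removal tool in Webmaster Tools, lets Google actually see the removal signal.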
Technical SEO | issuebasedmedia
-
301 redirect on the root of the site
Due to some historic difficulties with our URL Rewriter, we are in the position of having the root of our site 301 redirected to another page. So the root of our site: http://www.propertylive.co.uk/ has a 301 redirect to: http://www.propertylive.co.uk/home.aspx We're aware that this isn't great and we're working to fix this completely, but what impact will this have on our SEO?
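For reference, the usual fix is to serve the home page content at the root URL with an internal rewrite rather than an external 301. On an IIS/ASP.NET stack with the URL Rewrite module, that looks roughly like this; a sketch only, since the site's actual rewriter configuration may differ:

<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- Serve home.aspx at the root URL without issuing a redirect -->
        <rule name="RootContent" stopProcessing="true">
          <match url="^$" />
          <action type="Rewrite" url="home.aspx" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>

That way the root URL itself returns a 200 and accumulates link equity directly, instead of passing everything through a redirect hop.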
Technical SEO | LianWard86