HTTP response codes for a bulk list of URLs
-
Do you know of any tool that can find the HTTP response code for a bulk list of URLs?
-
Bless your little heart, Erica.
-
I found this one. (Haven't used it, but it looks like what you need!)
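If a dedicated tool doesn't pan out, a short script does the same job. A minimal sketch, assuming Python with the third-party requests library and a hypothetical URL list:

```python
import concurrent.futures
import requests

# Hypothetical input list; in practice, read the URLs from a file
urls = [
    "https://moz.com/",
    "http://www.example.com/old-page",
]

def status(url):
    try:
        # HEAD keeps the check light; allow_redirects=True reports the final code
        return url, requests.head(url, allow_redirects=True, timeout=10).status_code
    except requests.RequestException as exc:
        return url, f"error: {exc}"

# A small thread pool makes a long list finish in reasonable time
with concurrent.futures.ThreadPoolExecutor(max_workers=10) as pool:
    for url, code in pool.map(status, urls):
        print(code, url)
```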
Related Questions
-
Is it possible to 301 redirect HTTP www directly to HTTPS non-www?
I have a question that has been stumping me, and if someone could help I would gladly buy your coffee for a month. I have a website that used to be www and HTTP a year or two ago. Now it is HTTPS and non-www. A lot of my older links point to the www and HTTP version of my site, which results in two 301 redirects. For example, a link on another site points to http://www.mysite.com, and the network waterfall shows:
http://www.mysite.com 301 -> http://mysite.com
http://mysite.com 301 -> https://mysite.com
https://mysite.com (finally)
Two-part question:
--Do you think this two-hop 301 redirect would affect SEO performance? I can see it did affect page authority through Moz.
--Is there a way around this? I.e., to redirect http:// AND http://www directly to https:// with no hops in between. Thank you!
Intermediate & Advanced SEO | Stodzy
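Most servers can match scheme and host in a single rule so that every legacy variant 301s straight to https://mysite.com in one hop; the exact directive depends on the server (Apache, nginx, etc.). Either way, it's worth auditing the chain after the change. A minimal sketch, assuming Python with the requests library and the hypothetical domain from the question:

```python
import requests

# Follow a legacy URL and print every hop in the redirect chain
r = requests.get("http://www.mysite.com/", timeout=10)
for hop in r.history:
    print(hop.status_code, hop.url, "->", hop.headers.get("Location"))
print(r.status_code, r.url, "(final)")
```

A healthy result after the fix is a single 301 from each legacy variant directly to the HTTPS non-www URL.
-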
HTTP to HTTPS Migration Gone Wrong - Please Help!
We have a large (25,000 products) ecommerce website, and we did an HTTP-to-HTTPS migration on 3/14/17. Our rankings went in the tank, but they are slowly coming back. We initially lost 80% of our organic traffic; we are currently down about 50%. Here are some of the issues. In retrospect, we may have been too aggressive in the move:
- We didn't post our old sitemaps on the new site until about 5 days into the move.
- We created a new HTTPS property in Search Console.
- Our redirects were 302, not 301, and we also had some other redirect issues.
- We changed our URL taxonomy from http://www.oursite.com/category-name.html to https://www.oursite.com/category-name (removed the .html).
- We changed our filters plugin. Proper canonicals were used, but the filters can generate N! canonical pages. Yesterday I added some parameters (and posted them to Search Console) and noindexed pages with multiple filter choices to cut down on our crawl budget.
Here are some observations: Google is crawling like crazy - 120,000+ pages per day since the move. These are clearly the filtered pages, but they do have canonicals. Our old sitemaps got "Roboted Out" error messages, yet when we test the URLs in Google's robots.txt tester, they test fine. Very odd. At this point, in Search Console:
a. The HTTPS property has 23,000 pages indexed.
b. The HTTP property has 7,800 pages indexed.
c. The crawl of our old category sitemap (852 categories) is still pending; it was posted and submitted on Friday 3/17.
Our average daily organic traffic in Search Console before the move was +/-5,800 clicks. The most recent Search Console data shows HTTP: 645 clicks, HTTPS: 2,000 clicks. Our rank tracker shows a massive drop over 2 days, bottoming out, and then some recovery over the next 3 days. The HTTP site is showing 500,000 backlinks; HTTPS is showing 23,000 backlinks. I am planning on resubmitting the old sitemaps today in an attempt to remap our redirects to 301s. Is this typical? Any ideas?
Intermediate & Advanced SEO | GWMSEO
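The 302s are a likely culprit: a 302 is a temporary signal and passes no indication that the old URLs are gone for good, so remapping them to 301s (as planned) is the right move. A quick audit can confirm what each old URL actually returns. A minimal sketch, assuming Python with the requests library and hypothetical pre-migration URLs:

```python
import requests

# Hypothetical examples of pre-migration URLs to spot-check
old_urls = [
    "http://www.oursite.com/category-name.html",
    "http://www.oursite.com/another-category.html",
]

for url in old_urls:
    # allow_redirects=False exposes the first hop's own status code
    r = requests.head(url, allow_redirects=False, timeout=10)
    target = r.headers.get("Location", "(no Location header)")
    verdict = "OK" if r.status_code == 301 else "CHECK"
    print(f"{verdict}  {r.status_code}  {url} -> {target}")
```
-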
Bulk redirect or only a few pages at a time
Dear all, I would very much like to have your advice about whether or not to implement bulk 301 redirects. We have 3 retail websites with the same technical architecture, namely:
Netherlands-example.nl
Belgium-example.be
France-example.fr
These three websites are all bilingual, namely:
Netherlands-example.nl/nl
Netherlands-example.nl/fr
Belgium-example.be/nl
Belgium-example.be/fr
France-example.fr/nl
France-example.fr/fr
We're going to do a CMS update and therefore have to change a bulk of 301 redirects:
Part 1: For France (France-example.fr), URLs in the Dutch language (France-example.fr/nl) will be redirected to Belgium (Belgium-example.be/nl). It's a matter of about 8,000 redirects.
Part 2: For the Netherlands (Netherlands-example.nl), URLs in the French language (Netherlands-example.nl/fr) will be redirected to Belgium (Belgium-example.be/fr). It's also a matter of about 8,000 redirects.
Question: What will be the best way to implement these redirects? Fully implement Part 1 first (8,000 redirects) and then, a couple of weeks/months later, fully implement Part 2? Or will it be better to implement small batches of 200-500 every 2 weeks? I'd like to hear your opinion. Thanks in advance. Kind regards, Gerwin
Intermediate & Advanced SEO | footsteps
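Whichever rollout you choose, a scripted check makes it practical to verify thousands of redirects per batch. A minimal sketch, assuming Python with the requests library and a hypothetical redirect_map.csv of old_url,new_url pairs:

```python
import csv
import requests

# redirect_map.csv is a hypothetical file with one "old_url,new_url" pair per line
with open("redirect_map.csv", newline="") as f:
    for old_url, new_url in csv.reader(f):
        r = requests.head(old_url, allow_redirects=True, timeout=10)
        # The first response in the history is the hop the old URL itself returned
        first_hop = r.history[0].status_code if r.history else r.status_code
        ok = first_hop == 301 and r.url == new_url
        print(f"{'OK' if ok else 'FAIL'}  {old_url} -> {r.url} ({first_hop})")
```
-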
Wise or cluttery for a website? Should our "out of the mainstream" products be listed on our site? (older/discontinued, unfamiliar brands, parts for products, etc.)
For instance, should we list replacement parts for a music stand? Or parts for a trumpet, like a valve button? To some, this seems like a cluttery thing to do. I suppose another way to ask would be, "Should we only list the high-quantity selling items that are well branded and that everyone shops for, and leave the rest off the website for in-store customers only to buy?" (FYI: Our website focuses mainly on our local market, and we're not trying to take on the world per se, but if the world wants in, that's cool too.) (My thought here is that if a customer walks into our retail store and requests an oddball part or item... we go hunting for it and find it for them. Or perhaps another music store needs a part? To me, it's ALL for sale... right? Our retail depth should be reflected in our online presence as much as possible... correct? I'd personally choose to list the oddballs on our site, just as if a customer were standing in the store. Another side thought: if we only list the mainstream products... we are basically lessening our content (which could affect our rankings) and inviting ourselves into a more competitive marketplace, because we wouldn't be saying anything different from what most other music store sites say. I believe we need to show off our uniqueness... and product depth (of course with good SEO & content too) is really kind of it, aside from good expert people and a large facility. But perhaps that's the wrong way to look at it?) Thanks, Kevin
Intermediate & Advanced SEO | Kevin_McLeish
-
How is Google crawling and indexing this directory listing?
We have three directory listing pages that are being indexed by Google:
http://www.ccisolutions.com/StoreFront/jsp/
http://www.ccisolutions.com/StoreFront/jsp/html/
http://www.ccisolutions.com/StoreFront/jsp/pdf/
How and why is Googlebot crawling and indexing these pages? Nothing else links to them (although /jsp/html/ and /jsp/pdf/ both link back to /jsp/). They aren't disallowed in our robots.txt file, and I understand that this could be why. If we disallow them in our robots.txt file, will this prevent Googlebot from crawling and indexing those directory listing pages without prohibiting it from crawling and indexing the content that resides there, which is used to populate pages on our site? Having these pages indexed in Google is causing a myriad of issues, not the least of which is duplicate content. For example, the file CCI-SALES-STAFF.HTML (which appears on the directory listing referenced above, http://www.ccisolutions.com/StoreFront/jsp/html/) clicks through to this web page: http://www.ccisolutions.com/StoreFront/jsp/html/CCI-SALES-STAFF.HTML. This page is indexed in Google and we don't want it to be. But so is the actual page where we intended the content contained in that file to display: http://www.ccisolutions.com/StoreFront/category/meet-our-sales-staff. As you can see, this results in duplicate content problems. Is there a way to disallow Googlebot from crawling that directory listing page and, provided that we have this URL in our sitemap (http://www.ccisolutions.com/StoreFront/category/meet-our-sales-staff), solve the duplicate content issue as a result? For example:
Disallow: /StoreFront/jsp/
Disallow: /StoreFront/jsp/html/
Disallow: /StoreFront/jsp/pdf/
Can we do this without risking blocking Googlebot from content we do want crawled and indexed? Many thanks in advance for any and all help on this one!
Intermediate & Advanced SEO | danatanseo
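Because robots.txt rules are prefix matches, the first rule alone already covers the /html/ and /pdf/ subdirectories, and it would not touch /StoreFront/category/ URLs. The behavior can be sanity-checked locally before deploying. A minimal sketch, assuming Python's built-in urllib.robotparser and the rules from the question:

```python
from urllib import robotparser

# Parse the proposed rules directly; the single /StoreFront/jsp/ prefix
# also covers the /html/ and /pdf/ subdirectories
rules = """\
User-agent: *
Disallow: /StoreFront/jsp/
"""
rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

tests = [
    "http://www.ccisolutions.com/StoreFront/jsp/html/CCI-SALES-STAFF.HTML",
    "http://www.ccisolutions.com/StoreFront/category/meet-our-sales-staff",
]
for url in tests:
    print("allowed" if rp.can_fetch("*", url) else "blocked", url)
```

One caveat: robots.txt blocks crawling, not indexing; a blocked URL can still linger in the index if other pages link to it, so the already-indexed directory pages may take time to drop out.
-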
Duplicate pages with http and https
Hi all, We changed the payment part of our site from http to https a while ago. However, once on the https pages, all the footer and header links are relative URLs, so when users reach the payment pages and then navigate back to other pages on our website, they stay on https. The build-up of this happening has led to Google indexing all our pages in https (something we did not want to happen), and now we are in the situation where our homepage listing on Google is https rather than http. We would prefer the organic listings to be http (rather than https), and having read lots on this (including the great posts on the Moz (still feels odd not referring to it as SEOmoz!) blog around this subject), possible solutions include redirects or canonical tags. My additional questions around these options are:
1. We already have 2 redirects on some pages (long story); will another one negatively impact our rankings?
2. Is a canonical a strong enough hint to stop Google indexing the https versions of these pages, to the extent that our http pages will appear in natural listings again?
If anyone has any other suggestions or ideas of how to address this issue, that would be great! Thanks 🙂 Diana
Intermediate & Advanced SEO | Diana.varbanescu
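Before choosing between redirects and canonicals, it helps to know what each indexed https page currently declares as its canonical. A minimal sketch, assuming Python with the requests library, the standard library's html.parser, and hypothetical page URLs:

```python
from html.parser import HTMLParser
import requests

class CanonicalFinder(HTMLParser):
    """Collects the href of the first <link rel="canonical"> tag."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        rel = (a.get("rel") or "").lower()
        if tag == "link" and rel == "canonical" and self.canonical is None:
            self.canonical = a.get("href")

# Hypothetical https pages that ended up in the index
for url in ["https://www.example.com/", "https://www.example.com/products"]:
    finder = CanonicalFinder()
    finder.feed(requests.get(url, timeout=10).text)
    print(url, "->", finder.canonical or "(no canonical tag)")
```
-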
Question about HTTP Vary for Mobile
I'm reviewing https://developers.google.com/webmasters/smartphone-sites/redirects and wondering where exactly to add the HTTP Vary header. For a desktop request to a URL that has a mobile page, should we add "Vary: User-Agent" to the response header? Or, if the request came from a mobile device, should we then add "Vary: User-Agent" to the response header?
Intermediate & Advanced SEO | nicole.healthline
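The usual reading of that guidance is "both": any URL whose response differs by user agent (or that redirects some user agents) should send Vary: User-Agent on every response, so caches and crawlers know not to reuse one variant for the other. A minimal sketch of the idea, assuming Python's built-in wsgiref server and made-up page content:

```python
from wsgiref.simple_server import make_server

def app(environ, start_response):
    ua = environ.get("HTTP_USER_AGENT", "")
    body = b"mobile page" if "Mobile" in ua else b"desktop page"
    # Send Vary: User-Agent on every response for this URL, since the
    # content served depends on the requesting user agent
    headers = [("Content-Type", "text/plain"),
               ("Content-Length", str(len(body))),
               ("Vary", "User-Agent")]
    start_response("200 OK", headers)
    return [body]

if __name__ == "__main__":
    make_server("", 8000, app).serve_forever()
```
-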
Push for site-wide https, but all pages in index are http. Should I fight the tide?
Hi there, First Q&A question 🙂 So I understand the problems caused by having a few secure pages on a site: a few links to the https version of a page and you have duplicate content issues. While there are several posts here at SEOmoz that talk about the different ways of dealing with this issue with respect to secure pages, the majority of this content assumes that the goal of the SEO is to make sure no duplicate https pages end up in the index. The posts also suggest that https should only be used on login pages, contact forms, shopping carts, etc. That's the root of my problem. I'm facing the prospect of switching to https across an entire site. In light of other https-related content I've read, this might seem unnecessary or overkill, but there's a valid reason behind it. I work for a certificate authority - a company that issues SSL certificates, the cryptographic files that make the https protocol work. So there's an obvious need for our site to "appear" protected, even if no sensitive data is being moved through the pages. The stronger push, however, stems from our membership in the Online Trust Alliance (https://otalliance.org/). Essentially, in the parts of the internet that deal with SSL and security, there's a push for all sites to utilize HSTS headers and force sitewide https. PayPal and Bank of America are leading the way in this initiative, and other large retailers/banks/etc. will no doubt follow suit. Regardless of what you feel about all that, the reality is that we're looking at a future that involves more privacy protection, more SSL, and more https. The bottom line for me is: I have a site of ~800 pages that I will need to switch to https, and I'm finding it difficult to map the tips and tricks for keeping the odd pesky https page out of the index onto what amounts to a sitewide migration. So, here are a few general questions:
What are the major considerations for such a switch? Are there any less obvious pitfalls lurking?
Should I even consider trying to maintain an index of http pages, or should I start work on replacing (or having Googlebot replace) the old pages with https versions? Is that something that can be done with canonicalization, or would something at the server level be necessary?
How is that going to affect my page authority in general?
What obvious questions am I not asking?
Sorry to be so long-winded, but this is a tricky one for me, and I want to be sure I'm giving as much pertinent information as possible. Any input will be very much appreciated. Thanks, Dennis
Intermediate & Advanced SEO | dennis.globalsign
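For a sitewide migration like this, the usual pattern is sitewide 301s from every http URL to its https twin (done at the server level, not with canonicals alone), plus the HSTS header once the redirects are proven. A script can then confirm both signals across the ~800 pages. A minimal sketch, assuming Python with the requests library and hypothetical URLs:

```python
import requests

# Hypothetical http URLs from the old index to spot-check after the migration
for url in ["http://www.example.com/", "http://www.example.com/ssl-certificates"]:
    r = requests.get(url, timeout=10)
    first_hop = r.history[0].status_code if r.history else None  # 301 expected
    hsts = r.headers.get("Strict-Transport-Security")            # set on the final https response
    print(f"{url}\n  first hop: {first_hop}  final: {r.url}\n  HSTS: {hsts or '(not set)'}")
```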