404 errors & old unused pages
-
I am using Shopify and I need to delete some old pages which are coming up as 404 errors (product no longer available!). Does anyone know where you go to delete these pages which are no longer needed?
-
If you no longer carry that specific product, you can redirect the dead URL to a collection related to that product on your website. If there is no related collection, then redirect it to your homepage.
Read Shopify's instructions on how to set up a redirect in their platform here: https://help.shopify.com/en/manual/migrating-to-shopify/considerationsd
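For bulk cleanups, the redirect choice described above can be sketched as a small Python helper; the product handles and collection names below are hypothetical:

```python
def redirect_target(old_path, product_to_collection, homepage="/"):
    """Pick a 301 target for a dead product URL: prefer a related
    collection, otherwise fall back to the homepage."""
    handle = old_path.rstrip("/").split("/")[-1]
    collection = product_to_collection.get(handle)
    return f"/collections/{collection}" if collection else homepage

# hypothetical mapping of discontinued product handles to collections
mapping = {"red-wool-scarf": "scarves"}
print(redirect_target("/products/red-wool-scarf", mapping))  # /collections/scarves
print(redirect_target("/products/old-widget", mapping))      # /
```

The resulting old-URL/new-URL pairs can then be entered as redirects in the Shopify admin.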
-
I would have thought that if you are getting 404s instead of product pages, those pages were already deleted!
Related Questions
-
Internal linking: Repeating same low level pages from high hierarchy level pages
Hi all, We have 3 editions of our product that we are trying to rank better. Some of the feature-level pages repeat across these 3 editions, exactly like the example below: clothes.com/cotton-fabrics/shirts, clothes.com/wool-fabrics/shirts, clothes.com/polyester-fabrics/shirts. The "Shirts" page repeats in "cotton-fabrics", "wool-fabrics" and "polyester-fabrics". We have added rel=canonical so that "shirts" ranks in only one category. I wonder whether we need to take any other measures to make sure that these pages don't affect us negatively. Thanks
Web Design | | vtmoz0 -
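The rel=canonical consolidation described in the question above can be sketched as a URL mapping in Python; the category paths come from the example, and picking "cotton-fabrics" as the preferred version is a hypothetical choice:

```python
def canonical_for(path, preferred_category="cotton-fabrics"):
    """Point every category-specific copy of a shared page (e.g.
    /wool-fabrics/shirts) at the one version chosen to rank.
    The preferred category is a hypothetical choice."""
    parts = path.strip("/").split("/")
    if len(parts) == 2:
        return f"/{preferred_category}/{parts[1]}"
    return path

print(canonical_for("/wool-fabrics/shirts"))  # /cotton-fabrics/shirts
```

The returned URL is what would go in each duplicate page's `<link rel="canonical" href="...">` tag.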
Too Many Links on One Page - What to Do?!
Hello Geniuses, Prodigies, and Experts of the Field, My website pages for www.1099pro.com have too many links on one page, something like 150-175, and I understand that each page should ideally be under 100. Most of these links, approx. 105, come from dropdown navigation options in the header toolbar or the footer links. It is my take that these links make our site easier to navigate, but I'm sure that they are hurting my PageRank / SERPs. Is there a best way to handle a situation like this? I'd really prefer not to alter the header/footer layout of the entire site by removing 50-75 navigational links. The only other idea I have is below, but I have no idea if it would work: for any link that I do not care to pass PageRank through, add a rel="nofollow" attribute. This would be my favorite option if it is viable.
Web Design | | Stew2220 -
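Before deciding which links to nofollow, it helps to audit how many links a page actually carries and how many are already nofollowed. A minimal sketch using Python's standard-library HTML parser; the sample markup is hypothetical:

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Count <a href> links on a page, splitting out rel=nofollow ones."""
    def __init__(self):
        super().__init__()
        self.total = 0
        self.nofollow = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(k == "href" for k, _ in attrs):
            self.total += 1
            if any(k == "rel" and "nofollow" in (v or "") for k, v in attrs):
                self.nofollow += 1

# hypothetical page fragment
html = '<a href="/a">A</a> <a href="/b" rel="nofollow">B</a>'
p = LinkCounter()
p.feed(html)
print(p.total, p.nofollow)  # 2 1
```

Running this over the rendered header and footer would show exactly how many of the ~105 navigational links would need treatment.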
Is there a Joomla! Component For A Blog Page That Is Recommended?
A business partner currently has a page on a Joomla! website that is passing for the blog page. I am not a Joomla! guy, so I don't know much about it. I do know that I don't like a lot of things and prefer Drupal; however, changing that site to Drupal is not an option. We need to upgrade the blog page so that it is more like a blog, and I know there has to be an SEO-friendly component for a Joomla! blog page. Any ideas?
Web Design | | Atlanta-SMO1 -
Why aren't Images in G+ product page posts showing up in SERPs for brand searches?
Until 1-2 weeks ago, our G+ posts containing links to our product pages would show up in SERPs (when searching for our brand name) with a thumbnail of the product image. Now, they do not (see image below for visual). Our tech team confirmed there hasn't been any coding change that might be to blame, and I see that this isn't happening to other sites. Any idea what the problem may be here?
Web Design | | znotes0 -
AJAX endpoints returning 404 errors in GWT. Why!?!?
Hi guys, So I'm working through a large dataset of 404 errors and trying to clean up the site's crawlability. A piece of the puzzle I can't seem to wrap my head around has to do with AJAX endpoints. It looks like GWT thinks these are URLs that don't exist and, therefore, is reporting them as 404 errors. Has anyone experienced this before?
Web Design | | brad_dubs0 -
301 forwarding during site migration problem - several url versions of the same page....
Hello, I'm migrating from an old site to a new site, and 301-redirecting many of the pages... My key problem is this: I'm seeing www.website.com/ indexed in search engines, and www.website.com/default.aspx showing as the URL when I'm on the homepage - should I simply 301 both of these? Then for several internal pages there are 2-3 versions of each page indexed: canonicalization issues. Again, I'm wondering whether I should 301 each URL even if there are several different indexed URLs for the same page? Your advice will be welcome! Thanks in advance - Luke
Web Design | | McTaggart0 -
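One way to handle several indexed variants of the same page is to normalize every variant to a single canonical URL and 301 each variant to it. A minimal Python sketch; the suffix list is an assumption about which duplicate patterns exist on the site:

```python
def canonicalize(url):
    """Map duplicate variants of the same page (e.g. /default.aspx vs /)
    onto one canonical URL, so every variant gets a single 301 target.
    The suffix list is an assumption about the site's duplicate patterns."""
    url = url.lower()
    for suffix in ("/default.aspx", "/index.html", "/index.php"):
        if url.endswith(suffix):
            url = url[: -(len(suffix) - 1)]  # drop the filename, keep the slash
            break
    return url

print(canonicalize("http://www.website.com/default.aspx"))  # http://www.website.com/
```

Any URL whose canonical form differs from itself is a candidate for a 301 to that canonical form.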
404 page not found after site migration
Hi, A question from our developer. We have an issue in Google Webmaster Tools. A few months ago we killed off one of our e-commerce sites and set up another to replace it. The new site uses different software on a different domain. I set up a mass 301 redirect that would redirect any URLs to the new domain, so domain-one.com/product would redirect to domain-two.com/product. As it turns out, the new site doesn't use the same URLs for products as the old one did, so I deleted the mass 301 redirect. We're getting a lot of URLs showing up as 404 Not Found in Webmaster Tools. These URLs used to exist on the old site and were linked to from the old sitemap. Even URLs that have only recently started showing as 404 say that they are linked to in the old sitemap. The old sitemap no longer exists and has been returning a 404 error for some time now. Normally I would set up 301 redirects for each one and mark them as fixed, but there are almost a quarter of a million URLs returning 404 errors, and rising. I'm sure there are some genuine problems that need sorting out in that list, but I just can't see them under the mass of errors for pages that have been redirected from the old site. Because of this, I'm reluctant to set up a robots file that disallows all of the 404 URLs. The old site is no longer in the index. Searching Google for site:domain-one.com returns no results. Ideally, I'd like anything that was linked from the old sitemap to be removed from Webmaster Tools and for Google to stop attempting to crawl those pages. Thanks in advance.
Web Design | | PASSLtd0 -
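With a quarter of a million 404s, grouping the URLs by path prefix can surface the handful of bulk patterns (each fixable with one wildcard redirect) that are hiding the genuine one-off problems. A small Python sketch; the sample paths are hypothetical:

```python
from collections import Counter

def top_404_patterns(paths, depth=1, n=5):
    """Group 404 paths by their leading path segment(s) so bulk
    patterns (e.g. everything under /product/) surface above
    one-off errors worth individual attention."""
    prefixes = ["/" + "/".join(p.strip("/").split("/")[:depth]) for p in paths]
    return Counter(prefixes).most_common(n)

# hypothetical sample from the 404 export
paths = ["/product/a", "/product/b", "/product/c", "/blog/old-post"]
print(top_404_patterns(paths))  # [('/product', 3), ('/blog', 1)]
```

Feeding the full GWT export through this would show which one or two prefixes account for the bulk of the quarter million errors.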
Ruby on Rails & MemCaching
Anyone else building their website using Ruby on Rails? If so, are you also using any memcaching? Do you have any tips on memcached concepts, tricks, caveats, and experiences? How are you also doing SEO - any tricks for generating traffic and some Google love?
Web Design | | Goetzman0
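In Rails, memcached is normally wired in through the framework's caching layer, but the core concept (store an expensive result under a key with a TTL, and recompute only after it expires) can be illustrated with a toy sketch. This is a concept demo in Python, not a memcached client:

```python
import time

class TinyCache:
    """Toy in-memory cache with per-key TTL, illustrating the memcached
    idea of caching expensive fragments. Concept demo only; not a
    substitute for memcached itself."""
    def __init__(self):
        self._store = {}

    def set(self, key, value, ttl=60):
        # remember the value alongside its expiry timestamp
        self._store[key] = (value, time.time() + ttl)

    def get(self, key):
        item = self._store.get(key)
        if item is None:
            return None
        value, expires = item
        if time.time() > expires:
            del self._store[key]  # lazily evict expired entries
            return None
        return value

cache = TinyCache()
cache.set("fragment", "<ul>...</ul>", ttl=60)
print(cache.get("fragment"))  # <ul>...</ul>
```

The usual caveats carry over: pick cache keys that change when the underlying data changes, and size TTLs to how stale a fragment can afford to be.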