Cookies disabled pointing to a 404 page
-
Hi mozzers,
I am running an audit and disabled cookies on our homepage for testing purposes, and the page returned a 404 HTTP response. I tried other pages and they loaded correctly.
I assume this is not normal? Why is this happening, and could it harm the site's SEO?
Thanks!
-
Hello there,
I would always check with more than my own browser to make sure it isn't happening only for me. You can use the cURL tool from KeyCDN, or check with Google itself by using the "Fetch as Google" tool in Search Console.
If the problem persists, check whether any script on your site could be causing the issue.
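If you'd rather script the check, here is a minimal Python sketch (assuming the `requests` library and your own homepage URL, both stand-ins) that compares the response with and without cookies:

```python
import requests

URL = "https://www.example.com/"  # swap in your homepage

session = requests.Session()
session.get(URL)                   # first request picks up any cookies the site sets
with_cookies = session.get(URL)    # second request sends them back

no_cookies = requests.get(URL)     # fresh request, no cookies at all

print("With cookies:   ", with_cookies.status_code)
print("Without cookies:", no_cookies.status_code)
# A 404 only on the cookie-less request reproduces what most crawlers
# and command-line tools see, since they don't send cookies by default.
```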
Hope this helps,
Joseph Yap
-
Hmmm... that could definitely be a lot of different things. If it's WordPress, I would clear your browser cache for one thing. Then I would see if regenerating the permalinks fixes it.
If you're not actually preventing crawlers from crawling your site with 404 responses, it is unlikely you will encounter any SEO issues. However, you would certainly want to fix this error and find the cause.
Just go down a checklist or two until you find and pinpoint the problem, assuming it isn't the browser or the permalinks, which it is a good majority of the time.

Related Questions
-
My Website Page Speed is not increasing
Hey experts, my website page speed is not increasing. I used the WP Rocket plugin, but I am still seeing the errors "Reduce unused CSS", "Properly size images", and "Avoid serving legacy JavaScript to modern browsers" (see Screenshot (7).png). I have used many plugins for speed optimization but still face these errors. I optimized the images manually in Photoshop, but image size is still flagged as an issue. After the Google Core Web Vitals update, my keyword positions dropped because of the slow speed. Please guide me on how to increase the page speed of my website, https://karmanwalayfabrics.pk. Thanks
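On the "Properly size images" warning specifically, a small Python sketch using the Pillow library (hypothetical paths and width limit, assuming Pillow is installed) shows the kind of batch resize-and-compress pass that usually clears it:

```python
from pathlib import Path
from PIL import Image

MAX_WIDTH = 1200  # assumed largest width your theme actually displays

for path in Path("wp-content/uploads").rglob("*.jpg"):
    img = Image.open(path)
    if img.width > MAX_WIDTH:
        ratio = MAX_WIDTH / img.width
        img = img.resize((MAX_WIDTH, int(img.height * ratio)))
    # Re-save with moderate JPEG compression; tune quality to taste
    img.save(path, optimize=True, quality=80)
```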
Technical SEO | | frazashfaq110 -
How to find orphan pages
Hi all, I've been checking these forums for an answer on how to find orphaned pages on my site, and I can see a lot of people saying that I should cross-check my XML sitemap against a Screaming Frog crawl of my site. However, the sitemap is created using Screaming Frog in the first place... (I'm sure this is the case for a lot of people too). Are there any other ways to get a full list of orphaned pages? I assume it would be a developer request, but what should I ask them to look at or extract? Thanks!
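For reference, the cross-check itself is just a set difference, provided the page list comes from somewhere other than the crawl (for example, the CMS database or a live sitemap.xml). A minimal Python sketch, with a hypothetical sitemap URL and a crawl export CSV that has a "url" column:

```python
import csv
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # assumed
CRAWL_CSV = "crawl_internal_html.csv"                # assumed export

# URLs listed in the sitemap (or any list not derived from the crawl)
tree = ET.parse(urllib.request.urlopen(SITEMAP_URL))
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
sitemap_urls = {loc.text.strip() for loc in tree.findall(".//sm:loc", ns)}

# URLs the crawler actually reached by following links
with open(CRAWL_CSV, newline="") as f:
    crawled_urls = {row["url"] for row in csv.DictReader(f)}

# In the page list but never reached by crawling = orphan candidates
for url in sorted(sitemap_urls - crawled_urls):
    print(url)
```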
Technical SEO | | KJH-HAC1 -
404 Error Pages being picked up as duplicate content
Hi, I recently noticed an increase in duplicate content, but all of the pages are 404 error pages. For instance, Moz Site Crawl says this page, https://www.allconnect.com/sc-internet/internet.html, has 43 duplicates, and all the duplicates are also 404 pages (https://www.allconnect.com/Coxstatic.html, for instance, is a duplicate of this page). I'm looking for insight on how to fix this issue: should I add a rel=canonical tag to these 60 error pages that points to the original error page? Thanks!
Technical SEO | | kfallconnect0 -
What should we do with backlinks if the linking pages are 404?
Hi all, what should we do about backlinks whose linking pages now return 404? Open Site Explorer shows 1,000s of links, and when I check them, many are 404s. They are spammy links we used to have, but the linking sites are now 404. I am doing a link profile check and cleaning up all spammy links. Should I take any action on them, since Open Site Explorer and Google still show these links? Should we add these URLs to the disavow file in Google Webmaster Tools? Thanks
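As a rough illustration of that check, a Python sketch (assuming the `requests` library and a hypothetical plain-text export of linking URLs, one per line) can sort out which linking pages are actually gone before you decide what, if anything, to disavow:

```python
import requests

# One linking URL per line, e.g. exported from your link research tool
with open("backlink_urls.txt") as f:
    backlinks = [line.strip() for line in f if line.strip()]

dead, alive = [], []
for url in backlinks:
    try:
        status = requests.head(url, allow_redirects=True, timeout=10).status_code
    except requests.RequestException:
        status = None  # DNS failure, timeout, etc.
    (dead if status in (None, 404, 410) else alive).append((url, status))

print(f"{len(dead)} linking pages are gone, {len(alive)} still resolve")
# Links from pages that no longer exist generally drop out on their own;
# a disavow file is usually reserved for live, spammy links.
```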
Technical SEO | | mtthompsons0 -
How Does Google's "index" find the location of pages in the "page directory" to return?
This is my understanding of how Google's search works, and I am unsure about one thing in specific: Google continuously crawls websites and stores each page it finds (let's call it the "page directory"). Google's "page directory" is a cache, so it isn't the "live" version of the page. Google has separate storage called "the index", which contains all the keywords searched. These keywords in "the index" point to the pages in the "page directory" that contain the same keywords. When someone searches a keyword, that keyword is looked up in the "index" and returns all relevant pages in the "page directory". These returned pages are given ranks based on the algorithm. The one part I'm unsure of is how Google's "index" knows the location of relevant pages in the "page directory". The keyword entries in the "index" point to the "page directory" somehow. I'm thinking each page has a URL in the "page directory", and the entries in the "index" contain these URLs. Since Google's "page directory" is a cache, would the URLs be the same as the live website (and would the keywords in the "index" point to these URLs)? For example, if a webpage is found at www.website.com/page1, would the "page directory" store this page under that URL in Google's cache? The reason I want to discuss this is to understand the effects of changing a page's URL by understanding the search process better.
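Conceptually, the mapping being described is an inverted index: each keyword maps to the identifiers (such as URLs) of the stored documents that contain it. A toy Python sketch of that structure, using made-up pages and nothing to do with Google's actual internals:

```python
from collections import defaultdict

# Toy "page directory": URL -> cached page text
page_directory = {
    "www.website.com/page1": "blue widgets for sale",
    "www.website.com/page2": "red widgets and blue gadgets",
}

# Build the "index": keyword -> set of URLs whose cached copy contains it
index = defaultdict(set)
for url, text in page_directory.items():
    for word in text.split():
        index[word].add(url)

# A search looks the keyword up in the index, then fetches pages by URL
for url in index["blue"]:
    print(url, "->", page_directory[url])
```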
Technical SEO | | reidsteven750 -
Determining When to Break a Page Into Multiple Pages?
Suppose you have a page on your site that is a couple thousand words long. How would you determine when to split the page into two, and are there any SEO advantages to doing this, like being more focused on a specific topic? I noticed the Beginner's Guide to SEO is split into several pages, although it would concentrate the link juice if it were all on one page. Suppose you have a lot of comments. Is it better to move comments to a second page at a certain point? Sometimes the comments are not super focused on the topic of the page compared to the main text.
Technical SEO | | ProjectLabs1 -
Should we use a 301 or 302 redirect for 404 pages?
Please suggest which redirect we should use for 404 pages: 301 or 302. If you can elaborate with reasons, that would be highly appreciated.
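For reference, the mechanical difference is just the status code sent with the redirect: 301 signals a permanent move, 302 a temporary one. A minimal Flask sketch (hypothetical routes, assuming Flask is installed) issuing each kind:

```python
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/old-page")
def old_page():
    # 301: tell clients and crawlers the move is permanent
    return redirect("/new-page", code=301)

@app.route("/temporarily-moved")
def temporarily_moved():
    # 302: the default, treated as a temporary move
    return redirect("/holding-page", code=302)
```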
Technical SEO | | koamit0 -
Thoughts about stub pages - 200 & noindex ok, or 404?
With large database/template-driven websites, it is often possible to end up with a lot of pages that have no content on them. What are the current thoughts regarding these pages with no content? Options: (1) return a 200 header code with a noindex meta tag, (2) return a 404 page and header code, or (3) something else? Thanks
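To make the two options concrete, here is a small Flask sketch (hypothetical route and content check, not a recommendation either way) showing a stub page served as 200 plus noindex versus served as a genuine 404:

```python
from flask import Flask, make_response

app = Flask(__name__)

def page_has_content(slug):
    # Placeholder: look the slug up in your database/template data
    return False

@app.route("/items/<slug>")
def item(slug):
    if page_has_content(slug):
        return f"<h1>{slug}</h1>"

    # Option 1: serve the empty page as 200 but keep it out of the index
    resp = make_response("<h1>Nothing here yet</h1>", 200)
    resp.headers["X-Robots-Tag"] = "noindex"
    return resp

    # Option 2 (alternative): return a real 404 instead
    # return "<h1>Not found</h1>", 404
```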
Technical SEO | | slingshot0