Squarespace Errors
-
We have a website hosted by Squarespace. We are happy with the platform, but we've run some crawl diagnostics and noticed several errors.
These are primarily:
- Duplicate Page Title
- Duplicate Page Content
- Client Error (4xx)
We don't really understand why these errors are taking place, and wonder if someone on the SEOmoz forum with a firm understanding of Squarespace could assist us with this?
Thanks.
-
-
Hi! Looks like Rand beat us to this one, but just a few quick things to add for ya.
I did a quick crawl of the site in Screaming Frog.
Looks like most of the duplicate titles are due to two things:
- Your tag pages all have the same name. There must be a way to set the titles of your tag pages to reflect the tag name.
- It's counting every page twice: once with a trailing slash (i.e. /contact-us/) and once without (i.e. /contact-us).
(Note: it seems to be the same for your descriptions and page headers, which could be the "duplicate page content" error.)
So, fix your tag pages and look into why the pages load both with and without the trailing slash.
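If you want to confirm which URLs are the slash/no-slash pairs before raising it with Squarespace, a short script over your crawl export can surface them. This is a minimal sketch, assuming you have a plain list of crawled URLs (one per line) rather than the exact Screaming Frog export format:

```python
# Group crawled URLs that differ only by a trailing slash.
from collections import defaultdict

def find_slash_duplicates(urls):
    """Return groups of URLs that collapse to the same address
    once any trailing slash is stripped."""
    groups = defaultdict(list)
    for url in urls:
        groups[url.rstrip("/")].append(url)
    # Keep only the keys that collected more than one variant.
    return [group for group in groups.values() if len(group) > 1]

crawled = [
    "https://example.com/contact-us",
    "https://example.com/contact-us/",
    "https://example.com/about",
]
for pair in find_slash_duplicates(crawled):
    print(pair)
```

Each printed group is a pair the crawler counted as two separate pages; those are the ones a redirect or rel=canonical should collapse into one.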
The 404s are there, but they look like some sort of Squarespace issue with how images are cached, rather than actual pages on the site. Also, one of your links to Wikipedia has a stray character in it, making it a broken link.
I would verify/cross-check these errors in Google Webmaster Tools, and Rand's suggestion is perfect for the 404s.
Hope all of that helps!
-Dan
-
Hi Jeremy - I've worked with the SquareSpace crew a bit personally (don't know the system inside and out, but have a reasonable grasp on it). Could you share the URLs that are being reported with the duplicate issues? It's likely a URL parameter that's relatively easy to fix with rel=canonical or the like.
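To see whether a URL parameter really is the culprit, one quick check is to bucket the reported URLs by their parameter-free form; any bucket with more than one entry is a candidate for a single rel=canonical target. A minimal sketch (the example URLs are hypothetical, not from Jeremy's report):

```python
# Bucket URLs by scheme + host + path, dropping query strings and
# fragments, to see which parameters generate duplicate reports.
from collections import defaultdict
from urllib.parse import urlsplit, urlunsplit

def canonical_form(url):
    """Return the URL without its query string or fragment."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

def group_by_canonical(urls):
    groups = defaultdict(list)
    for url in urls:
        groups[canonical_form(url)].append(url)
    # Only buckets with multiple variants indicate duplication.
    return {k: v for k, v in groups.items() if len(v) > 1}

reported = [
    "https://example.com/blog/post?month=3",
    "https://example.com/blog/post",
    "https://example.com/about",
]
for canonical, variants in group_by_canonical(reported).items():
    print(canonical, "<-", variants)
```

The key of each bucket is the URL you would point rel=canonical at from the parameterized variants.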
As far as 404s go, you might want to download the XLS for those and see which pages are reported as linking to the error pages. If you think they should be active, repair them; if there's a structural problem, you may need to report it to Squarespace's customer service.
Sorry for my long delay!
Rand