New website server code errors
-
I launched a new website at www.cheaptubes.com and have since recovered the search engine rankings I lost to the Penguin and Panda updates. I'm continuing to improve the site, but Moz Analytics says I have 288 medium issues and I see the warning "45% of site pages served 302 redirects during the last crawl". I'm not sure how to fix this. I'm on WordPress using Yoast SEO, so all the redirects I set up are 301s, not 302s. I do have SSL - could it be HTTP vs HTTPS?
-
Hi
As this post is already marked as answered, it's better to open a new question and reference this one (otherwise nobody will notice it). I would also strongly advise asking this question on a more technical forum such as stackoverflow.com.
Dirk
-
Can someone please help me? The code that was suggested either didn't work or I'm not putting it in right, because it broke my site and I have no idea how to fix it. This past weekend I installed W3 Total Cache and used TinyPNG to compress my images and optimize the site. Google PageSpeed says my website is loading 30% faster, but now I'm losing rankings again and the Moz crawl says there are 618 pages with redirects compared to 300 last week. I didn't redirect any pages last weekend. Is it possible that W3 Total Cache did it? I was trying to fix the PageSpeed issues below, which is why I installed the cache plugin and did the image optimization.
Should Fix:
Eliminate render-blocking JavaScript and CSS in above-the-fold content
Optimize images
Consider Fixing:
Minify JavaScript
Leverage browser caching
Minify CSS
Minify HTML
-
Hi,
I am not really an expert in rewrite rules, but normally the syntax is:
RewriteCond: the condition to test
RewriteRule: if the condition is met, apply this rule (e.g. a redirect)
The order in which the rules are placed is important. In the example above, by putting the redirect to https first and then the rule that non-www requests should be redirected to http://www, you end up redirecting back to http instead of https.
So your .htaccess would look like the block below - the comments explain each directive, and the commented-out lines are optional alternatives to try (strip the comments before use). No guarantee it's going to work, but it's worth a try. If it doesn't work, you could post your question on Stack Overflow (regex and redirects are way too technical for most of us).
```apache
# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index.php$ - [L]                 # by default all URLs in WP are routed to index.php
RewriteCond %{REQUEST_FILENAME} !-f           # if the request is not an existing file (f)...
RewriteCond %{REQUEST_FILENAME} !-d           # ...and not an existing directory (d)
RewriteRule . /index.php [L]                  # send it to the root index.php

# RewriteEngine On                            # the rewrite engine is already on - no need to put this a second time
RewriteCond %{HTTP_HOST} !^www.               # if there is no www in the request
RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]      # redirect to www
# (you could also try this pair instead, with example.com replaced by your site:)
# RewriteCond %{HTTP_HOST} !^www.example.com$ [NC]
# RewriteRule .? http://www.example.com%{REQUEST_URI} [R=301,L]

RewriteCond %{HTTPS} !=on                     # if the request is not https
RewriteRule ^.*$ https://%{SERVER_NAME}%{REQUEST_URI} [R,L]   # redirect to https
# (you could also try this pair instead:)
# RewriteCond %{HTTPS} off
# RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI}
</IfModule>
# END WordPress
```
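(As an aside, the two hops above - first to www, then to https - can be collapsed into a single 301. The sketch below is an untested variant of the same idea with the preferred hostname hard-coded, which is an assumption about the canonical domain, so adjust it before trying:)

```apache
# Sketch: anything that is not already https, or not already on www,
# gets sent to https://www.cheaptubes.com/... in a single 301
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ https://www.cheaptubes.com/$1 [R=301,L]
```

The [OR] flag makes either condition sufficient, so a request for http://cheaptubes.com/page/ goes straight to https://www.cheaptubes.com/page/ instead of being redirected twice.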
-
I have tried again to insert the code and it broke my site again. I was able to edit the same file to leverage browser caching successfully, so I don't know what I'm doing wrong. I tried moving the code to right below RewriteEngine On and before RewriteBase, and I tried it at the end where I tried before - no luck. The file now looks like this:
```apache
# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]

# Browser Caching
FileETag MTime Size
<IfModule expires_module>
ExpiresActive on
ExpiresDefault "access plus 1 week"
</IfModule>
# END WordPress
</IfModule>
```
Does anyone have suggestions on how to implement Andy's and Dirk's ideas?
-
Hi Andy
When I tried to do as you suggested, it gave me an internal server error. Please see the .htaccess code and the server error below. I've taken the code out for now.
```apache
# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
RewriteEngine On RewriteCond %{HTTPS} !=on RewriteRule ^.*$ https://%{SERVER_NAME}%{REQUEST_URI} [R,L]
RewriteCond %{HTTP_HOST} !^www. RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]
</IfModule>
# END WordPress
```
Internal Server Error
The server encountered an internal error or misconfiguration and was unable to complete your request.
Please contact the server administrator, webmaster@cheaptubes.com and inform them of the time the error occurred, and anything you might have done that may have caused the error.
More information about this error may be available in the server error log.
Additionally, a 500 Internal Server Error error was encountered while trying to use an ErrorDocument to handle the request.
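(For what it's worth: if the RewriteCond and RewriteRule directives really are run together on single lines in the actual file, as they appear above, that alone would produce this 500 - Apache reads everything on one line as arguments to the first directive and rejects it as a syntax error. A minimal sketch of the same rules with each directive on its own line, assuming nothing else in the file changes:)

```apache
# Same rules as above, but each directive starts on its own line.
# RewriteEngine On is already set earlier in the block, so it is not repeated here.
RewriteCond %{HTTPS} !=on
RewriteRule ^.*$ https://%{SERVER_NAME}%{REQUEST_URI} [R,L]

RewriteCond %{HTTP_HOST} !^www\.
RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]
```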
-
No problem at all. Let me know if you get stuck.
-Andy
-
Thanks Andy. I had no idea how to fix it. I'll give these a try today as soon as I fix a cart shipping glitch.
-
Thank you Dirk. That must be what Moz sees too. I'm new to WP and worked with a developer early on who turned out to be just kinda coasting through easy 3-5 page sites, but not very good with ecommerce sites like mine. I've spent months finishing and fixing it. If you want a laugh, my old site until 3 weeks ago was still in FrontPage, and that's why (along with outdated SEO methods) I got hit hard by the algorithm updates. This site was built on another server and moved, and I'm still fixing outdated links to the dev server where I find them. I will try your suggestion today.
-
Hi,
As Dirk said, you have a lot of HTTP vs HTTPS issues, so you should fix these with a bit of .htaccess code, as below.
```apache
RewriteEngine On
RewriteCond %{HTTPS} !=on
RewriteRule ^.*$ https://%{SERVER_NAME}%{REQUEST_URI} [R,L]
```
You also have some non-www to www issues. You can fix these in .htaccess at the same time...
```apache
RewriteCond %{HTTP_HOST} !^www\.
RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]
```
You should find this fixes a lot of your issues. Also check in your WordPress general settings that the site is set to www.cheaptubes.com in both the WordPress Address and Site Address fields.
Let me know how you get on with these.
-Andy
-
Hi,
The 302s are coming from redirects from https to http. Example: https://www.cheaptubes.com/product-tag/monlayer-graphene-on-cu/ is redirecting to http://www.cheaptubes.com/product-tag/monlayer-graphene-on-cu/ with a 302 (you seem to have a lot of these redirects).
There also seems to be an issue with your URLs. You have quite a large number of 4xx errors, mainly caused by .js files being requested relative to product pages rather than from the theme folder where they actually live. Example: http://www.cheaptubes.com/product/f-functionalized-graphene-nanoplatelets/js/foundation.min.js returns an error, while the file is actually at http://www.cheaptubes.com/wp-content/themes/cheaptubes/js/foundation.min.js - so I guess something is wrong with the way the internal links are constructed.
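(If the theme really is printing relative script paths, the proper fix is in the theme's templates, but a stop-gap sketch in .htaccess - the theme folder name and URL pattern here are assumptions taken from the example above - could redirect the misdirected requests to the real file while that gets fixed:)

```apache
# Hypothetical stop-gap: a request like /product/some-product/js/foundation.min.js
# is redirected to the theme's actual js directory. The long-term fix is to output
# absolute (or root-relative) script URLs in the theme itself.
RewriteRule ^product/[^/]+/js/(.+)$ /wp-content/themes/cheaptubes/js/$1 [R=301,L]
```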
Hope this helps,
Dirk