Thank you Andreas,
But is the redirect HTTP header reading correct or not?
Also, site:seomoz.com is showing a very small number of pages, while my client's old website is appearing on Google in full. Every single page.
Issa
Hi there,
You might have experienced this before, but for me this is a first.
A client of mine moved from domain A (www.domainA.com) to domain B (www.domainB.com). 301 redirects have all been in place for over a year, but the old domain is still showing in Google when you search for "site:domainA.com".
The HTTP Header check shows this result for the URL https://www.domainA.com/company/cookie-policy.aspx
HTTP/1.1 301 Moved Permanently
Cache-Control: private
Content-Length: 174
Content-Type: text/html; charset=utf-8
Location: https://www.domainB.com/legal/cookie-policy
Server: Microsoft-IIS/10.0
X-AspNetMvc-Version: 5.2
X-AspNet-Version: 4.0.30319
X-Powered-By: ASP.NET
Date: Fri, 15 Mar 2019 12:01:33 GMT
Connection: close
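For reference, this check can be reproduced with a short Node.js sketch like the one below; the URL is the example from above, so swap in any page you want to test:

// Minimal sketch: send a HEAD request and print the redirect status
// and Location header (assumes Node.js 10.9+).
const https = require('https');

https.request(
  'https://www.domainA.com/company/cookie-policy.aspx',
  { method: 'HEAD' },
  (res) => {
    console.log(res.statusCode);       // expect 301
    console.log(res.headers.location); // expect the domainB.com URL
  }
).end();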
Does the redirect look wrong? The change of address request was made in Google Search Console when the website was moved over a year ago.
Edit: I checked domainA.com on Bing and it seems it's not indexed there; it has been replaced with domainB.com, which is right. Only Google is still indexing the old domain!
Please let me know your thoughts on why this is happening.
Best,
Hi Radi,
I don't see any issues with this as long as every suburb page has unique, quality content, as opposed to the duplicate, thin content that usually results from creating unnecessary pages just to target additional keywords.
So say you have "Area A", which includes "Suburb A" and "Suburb B".
If Area A has too many listings on domain.com/area-a/, then you can split that into:
Suburb A: domain.com/area-a/plastic-plants-suburb-a
Suburb B: domain.com/area-a/plastic-plants-suburb-b
Listings should be unique and of high quality on each of these pages; none should be repeated.
I hope this helps
Issa
Hi David,
This is completely your choice. Your website's SEO and overall rankings won't be affected by blocking SEMrush, for example.
The only drawback I can think of is that you won't be able to use the tool for your own domain research, which can be helpful if you're looking for ranking stats or click share from search engines. I would hate not being able to research my clients' websites regularly, as this is the kind of intelligence I use to a) understand how Google and users find my websites, and b) discover technical and SEO issues with my websites.
So although you can go ahead and do it, I would say you should always be ready to unblock them when needed. Unfortunately, SEMrush doesn't update its database very regularly, so you won't be able to predict when you should unblock its robot to refresh your data.
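If you do decide to block it, a minimal robots.txt rule would look like the sketch below, assuming the crawler honours robots.txt and its documented SemrushBot user-agent token:

# Block SEMrush's crawler from the whole site
User-agent: SemrushBot
Disallow: /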
I hope I could help
Issa
Hi usDragons,
Having too many crawl errors is not healthy. Normally a few pages get deleted every now and then, but having hundreds or thousands of 404s means something is wrong with the website, and from your description it's obvious that something is wrong. In fact, redirecting unnatural/thin-content pages to your website can harm it, as these are effectively links sending traffic (through 301 redirects) to your website, so you need to disavow them.
Because you have no control over that website, you should treat it as an external site that is spamming you, not as a site you own but have no access to.
The disavow tool requires you to create a .txt file with an explanation of why you are disavowing each group of domains/links. So you should explain that these are bad links sending you traffic, that you tried to request their removal, and that you got no help from whoever controls the site, which I guess is true in your case.
Try to explain everything in your comments in the .txt file (see attached).
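As a rough sketch, a disavow file with comments might look like this (the domain names are placeholders):

# Pages on this domain 301-redirect thin/unnatural content to our site.
# We requested removal and received no response.
domain:spammy-example.com
# A single page that links to us
http://another-example.com/bad-page.html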
Good luck, and I hope I could help in any way.
Hi,
You should find your answer in this link: https://developers.google.com/analytics/devguides/collection/analyticsjs/enhanced-ecommerce#overview
But essentially you will have to use something like:
ga('ec:addImpression', {  // Provide product details in an impressionFieldObject.
  'id': 'P12345',                   // Product ID (string).
  'name': 'Android Warhol T-Shirt', // Product name (string).
  'category': 'Apparel/T-Shirts',   // Product category (string).
  'brand': 'Google',                // Product brand (string).
  'variant': 'Black',               // Product variant (string).
  'list': 'Search Results',         // Product list (string).
  'position': 1,                    // Product position (number).
  'dimension1': 'Member'            // Custom dimension (string).
});
But you need to specify the impressionFieldObject, as in the top line. Read the link I sent you and it should all be self-explanatory.
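One detail worth noting from those docs: ec:addImpression only queues the impression data; it is actually sent with the next hit, typically the pageview:

// The queued impression data goes out with the next hit:
ga('send', 'pageview');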
I hope this helps
Issa
Hi there,
Seems to me that you should follow the standard process for unnatural links. You should:
1. Audit your backlink profile and identify the unnatural links.
2. Contact the webmasters of the linking sites and request removal.
3. Disavow the links you can't get removed.
I know it's not straightforward nor fast, but that's how you maintain a clean public link profile for any website since the Penguin updates started.
I hope it helps
Great thank you.
Will have a read.
Still, though, with the situation above: is it OK for this industry to have such duplicate content, and what should be done about it if it's not?
Thanks
Hi everyone,
It seems that Panda 4.2 has hit some industries more than others. I just started working on a website that has no manual action, but its organic traffic has dropped massively in the last few months. Its external linking profile seems to be fine, but I suspect usability issues, especially the duplication, may be the reason.
The website is a recruitment website for one specific industry only. However, they post jobs for their clients that can be very similar, and at the same time they can have 20 jobs with the same title and very similar job descriptions. The website currently has over 200 pages with potential duplicate content.
Additionally, these jobs get posted on job portals with the same content (this happens automatically through a feed).
The questions here are:
1. Is it OK for this industry to have this much duplicate content?
2. If it's not, what is the best way to deal with it?
Thank you in advance.