Web designer doesn't see duplicate pages or issues??? Help!
-
I have a website that has duplicate pages, and I want to fix them. I told my web guy and his response was:
"we do not see any issues on our end. It is not hurting any ranking in the 3 major searches. We have no duplicate pages"
Here's the site http://www.wilkersoninsuranceagency.com/
Here's the home page again http://www.wilkersoninsuranceagency.com/index.php
Here's what Moz says: please see attached image.
I'm not sure what to tell him, as I'm not a web person, but Moz is telling me we have issues on the site, and I feel like they should be fixed.
-
Best of luck, and feel free to send me an email (or a message on here) if you need me to go into any further detail.
-
I'm now well armed to go back to him and ask again for him to make these changes.
Hopefully he'll listen to me this time! Though I'm now thinking I need a new guy...
Thanks Thomas!
-
Thank you Laura, you've given me a clearer understanding of what I need to say to him. I do think I need a new guy though...
-
Your web developer is wrong. You do have multiple versions of the homepage.
You have the following:
http://wilkersoninsuranceagency.com/index.php
http://wilkersoninsuranceagency.com/
http://www.wilkersoninsuranceagency.com/index.php
http://www.wilkersoninsuranceagency.com/
I would first of all get your web developer to set up a site-wide 301 redirect from http:// to http://www., so that whenever a user lands on your site they at least end up on the same version.
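For reference, on an Apache server this kind of redirect is usually a few lines in the site's .htaccess file. This is only a sketch (it assumes Apache with mod_rewrite enabled; your developer will know what actually applies to your host):

```apache
# Assumes Apache with mod_rewrite enabled
RewriteEngine On

# If the request came in without "www.", 301-redirect it to the
# www version, preserving the rest of the URL
RewriteCond %{HTTP_HOST} ^wilkersoninsuranceagency\.com$ [NC]
RewriteRule ^(.*)$ http://www.wilkersoninsuranceagency.com/$1 [R=301,L]
```

A 301 (permanent) redirect is the important part: it tells search engines to transfer any ranking signals to the www version rather than treating the two as separate pages.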
As for the with/without index.php duplicates, I would ask the web developer to remove any internal links that reference index.php and link only to http://www.wilkersoninsuranceagency.com/, while also setting a canonical tag within index.php pointing to http://www.wilkersoninsuranceagency.com/.
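The canonical tag itself is just one line in the page's `<head>`; something like:

```html
<!-- In the <head> of index.php: tells search engines that the bare "/" URL
     is the preferred version of this page -->
<link rel="canonical" href="http://www.wilkersoninsuranceagency.com/" />
```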
If your developer tries to fight this I would look for a new developer.
Just had a quick look through the site, and your developer appears to be cutting corners. https://gtmetrix.com/reports/wilkersoninsuranceagency.com/I6khUBrH flags the site under "serve scaled images": they haven't bothered to resize the images, and that's making your website load more slowly.
http://wilkersoninsuranceagency.com/images/img0018.png is just a lazy way of naming images.
-
Yes, this is a duplicate problem when it comes to SEO. This is an issue for two reasons:
1. Search engines see these as two different URLs even though they point to the same page. They may figure out that it's the same page, but that isn't guaranteed. The correct way to handle this is to 301-redirect http://www.wilkersoninsuranceagency.com/index.php to http://www.wilkersoninsuranceagency.com/. (If your designer doesn't know how to do this, you need a new web designer.)
2. This is also a problem because any backlinks may be split between the two URLs even though they are the same page. This will be fixed by the 301-redirect as well.
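If the site runs on Apache, the 301-redirect described above could look something like this in .htaccess (a sketch, assuming mod_rewrite is available; not something you need to write yourself, but useful to show your designer):

```apache
RewriteEngine On

# 301-redirect direct requests for /index.php to the bare "/" URL.
# Matching against THE_REQUEST (the browser's original request line)
# prevents a redirect loop when Apache internally serves index.php
# for requests to "/".
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.php[\s?] [NC]
RewriteRule ^index\.php$ http://www.wilkersoninsuranceagency.com/ [R=301,L]
```

Once this is in place, both URLs resolve to a single address, so link equity is consolidated instead of being split.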