Web designer doesn't see duplicate pages or issues??? Help!
-
I have a website that has duplicate pages, and I want to fix them. I told my web guy and his response was:
"we do not see any issues on our end. It is not hurting any ranking in the 3 major searches. We have no duplicate pages"
Here's the site http://www.wilkersoninsuranceagency.com/
Here's the home page again http://www.wilkersoninsuranceagency.com/index.php
Here's what Moz says: please see the attached image.
I'm not sure what to tell him, as I'm not a web person, but Moz is telling me we have issues on the site and I feel like they should be fixed.
-
Best of luck, and feel free to send me an email (or a message on here) if you need me to go into any further detail.
-
I'm now well armed to go back to him and ask again for him to make these changes.
Hopefully he'll listen to me this time! Though I'm now thinking I need a new guy...
Thanks Thomas!
-
Thank you Laura, you've given me a clearer understanding of what I need to say to him. I do think I need a new guy though...
-
Your web developer is wrong. You do have multiple versions of the homepage.
You have the following:
http://wilkersoninsuranceagency.com/index.php
http://wilkersoninsuranceagency.com/
http://www.wilkersoninsuranceagency.com/index.php
http://www.wilkersoninsuranceagency.com/
I would first of all get your web developer to set up a site-wide 301 redirect from http:// to http://www., so that whenever a user lands on your site they at least end up on the same version.
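For reference, here's a minimal sketch of what that redirect could look like in an Apache .htaccess file. This assumes the site runs on Apache with mod_rewrite enabled; your developer should adapt it to the actual server setup:

```apache
# Permanently (301) redirect the non-www host to the www version,
# preserving the requested path
RewriteEngine On
RewriteCond %{HTTP_HOST} ^wilkersoninsuranceagency\.com$ [NC]
RewriteRule ^(.*)$ http://www.wilkersoninsuranceagency.com/$1 [R=301,L]
```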
As for the with/without index.php duplicates, I would ask the web developer to remove any internal references to index.php and link specifically to http://www.wilkersoninsuranceagency.com/ instead, while also setting a canonical within index.php pointing to http://www.wilkersoninsuranceagency.com/.
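The canonical itself is just one line in the head section of index.php, something like:

```html
<!-- Tells search engines that the bare homepage URL is the preferred version -->
<link rel="canonical" href="http://www.wilkersoninsuranceagency.com/" />
```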
If your developer tries to fight this I would look for a new developer.
Just had a quick look through the site, and your developer looks to be a bit lazy. In this GTmetrix report https://gtmetrix.com/reports/wilkersoninsuranceagency.com/I6khUBrH, under "Serve scaled images", the images haven't been resized, which is making your website load slower.
http://wilkersoninsuranceagency.com/images/img0018.png is also a lazy way of naming images; descriptive filenames would help with image search as well.
-
Yes, this is a duplicate content problem when it comes to SEO. It's an issue for two reasons:
1. Search engines see these as two different URLs even though they point to the same page. They probably recognize that it's the same page, but that's not guaranteed. The correct way to handle this is to 301-redirect http://www.wilkersoninsuranceagency.com/index.php to http://www.wilkersoninsuranceagency.com/. (If your designer doesn't know how to do this, you need a new web designer.)
2. It's also a problem because any backlinks may be split between the two URLs even though they point to the same page. The 301 redirect will fix this as well.
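If the site is on Apache, that 301 can be sketched in .htaccess roughly as follows (assuming mod_rewrite is available; matching on THE_REQUEST means only direct browser requests for /index.php are redirected, so the server's internal handling of index.php doesn't loop):

```apache
# 301-redirect direct requests for /index.php to the root URL
RewriteEngine On
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.php[?\ ] [NC]
RewriteRule ^index\.php$ http://www.wilkersoninsuranceagency.com/ [R=301,L]
```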