Pagination Issues
-
Hi,
My press release section is showing pagination issues (duplicate pages)
http://www.me-by-melia.com/press/
I know they are showing 27 duplicate page titles, which would be an easy fix. Do you recommend pagination?
Let me know if you see any other issues.
-
Click Start Capture; then, when you load the page, it will list all requests and their status codes.
-
What do you recommend for changing the URL structure from http://www.me-by-melia.com/index3.html to www.me-by-melia.com/London?
I tried pressing F12 in IE8 and clicking the Network tab. How do you distinguish how many redirects there were?
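One way to make the redirect count concrete: in a Network capture like the one above, every 3xx response is one redirect hop before the final 200. Here is a minimal sketch of counting hops from a captured request list; the URLs and status codes below are illustrative, not taken from the live site:

```python
# Count redirect hops in a captured request log, like the one the
# browser's Network tab shows: each 3xx response is one redirect.

REDIRECT_CODES = {301, 302, 303, 307, 308}

def count_redirects(capture):
    """capture: list of (url, status_code) tuples in load order."""
    return sum(1 for _url, status in capture if status in REDIRECT_CODES)

# Illustrative capture: index3.html 301s once to /London, which returns 200.
capture = [
    ("http://www.me-by-melia.com/index3.html", 301),
    ("http://www.me-by-melia.com/London", 200),
]
print(count_redirects(capture))  # 1
```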
-
Excellent point; I wholeheartedly agree!
-
Do you mean pass a title in the query string so they have individual titles? Yes, that would be a good idea. Duplicate titles are a waste of prime SEO real estate.
If the content changes significantly when you choose a different page, then of course DO NOT use canonical tags.
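Templating the page number into the title is enough to make all 27 titles unique. A sketch of one way to do it; the site name and wording here are assumptions, not the site's actual titles:

```python
# Give each paginated press page its own <title> instead of
# repeating one title across all 27 pages. Wording is illustrative.

def press_page_title(page, total, site="ME by Melia"):
    if page == 1:
        return f"Press Releases | {site}"
    return f"Press Releases - Page {page} of {total} | {site}"

print(press_page_title(1, 27))   # Press Releases | ME by Melia
print(press_page_title(26, 27))  # Press Releases - Page 26 of 27 | ME by Melia
```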
-
You're right. I was referring to duplicate titles. Something as simple as that can be fixed by updating the title tags. Do you recommend changing the titles of the pages in the URL string instead of showing /press26 or /press27?
Good suggestions though!
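If you do put titles into the URL instead of /press26, a simple slug routine covers the conversion; this is only a sketch, and the headline used is made up for illustration:

```python
import re

def press_url(headline):
    """Turn a press release headline into a descriptive URL path."""
    slug = re.sub(r"[^a-z0-9]+", "-", headline.lower()).strip("-")
    return f"/press/{slug}"

print(press_url("ME London Opens This Spring"))  # /press/me-london-opens-this-spring
```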
-
Sorry, I may have misunderstood.
Is it duplicate content? If so, I would do as I suggested.
If they are individual press releases, then why are they being reported as duplicates? You need to add enough content to make sure that they are seen as individual pages.
I cannot load the page; it seems to be offline or having some other problem, so I don't understand how you are using pagination.
-
Will the individual press release pages still get indexed and ranked individually in the search engines?
-
Add <link rel="canonical" href="http://www.me-by-melia.com/press/" /> to the head of each page.
This will tell the search engines that, no matter what the parameters are, credit goes to http://www.me-by-melia.com/press/
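The same rule can be applied mechanically when generating the tag: strip the query string so every parameterized variant points at one canonical URL. A sketch using the standard library; the example URL is illustrative:

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_href(url):
    """Drop the query string and fragment so every parameterized
    variant credits the same canonical URL."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

print(canonical_href("http://www.me-by-melia.com/press/?page=26"))
# http://www.me-by-melia.com/press/
```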
Related Questions
-
Ranking Issue for New Site
Hi all, I have a specific SEO challenge. Six months ago, we started to build an eCommerce site (located in the UK). In order to speed up the site launch, we copied the entire site over from an existing site based in Ireland. Now the new UK site has been running for five months. Google has indexed many pages, which is good, but we can't rank high (position: between 20-30 for most pages). We thought it was because of content duplication despite the different regions, so we tried to optimize the pages for the UK site to make them more UK-related and avoid content duplication. I've also used schema to tell Google it's a UK-based site, set up Google My Business, and got more local citations. If you could give me any further suggestions, it'd be perfect.
Intermediate & Advanced SEO | Insightful_Media
Thank you so much for your time and advice.
-
Main Nav Redirects Issue: Unnecessary Evil or Better Solution
Hi, I'm somewhat stumped on the best course of action for our navigation menu. Our site is "divided" into two areas: informational and transactional. Because of compliance, transactional users have to be geo-targeted; therefore, we serve them a specific page (we have 6 different regions: UK, AUS, EU, etc.). If users visit the informational side, it's not geo-specific. Example: https://site/charts https://site/uk/money Within our main nav, we don't specify the geo transaction page and use a generic https://site/money/ (a page that doesn't exist); when a user clicks that link, we detect their location and serve up a 301 redirect to the correct geo page. This has obviously caused a ton of unnecessary redirects and a waste of powerful link equity from the header of the site. It's been recommended to dynamically change the linked URL in this header based on the location of the user. That sounds good, but what about Google? Since we can't detect the Google crawler IP, we would have to pick a default geo URL like /uk/money. If we do that, the other regional URLs suffer on link equity. How do we minimize redirects and make Google happy for all our geo pages? Hope this makes sense, and thanks for your time!
Intermediate & Advanced SEO | Bragg
-
Mass Product Page Upload - SEO Issue?
Hi, We will be adding a lot of products to our site in a mass referencing exercise; not all in one go, but 10,000 split into a few loads. This product content won't be duplicate, but the quality of the information may be sparse and not very high. My question is whether adding a bulk of these pages will reduce the overall domain authority of our site? Thank you
Intermediate & Advanced SEO | BeckyKey
-
Strange Cross Domain Canonical Issue...
We have 2 identical ecommerce sites. Using a 301 is not an option since both are major brands. We've been testing cross-domain canonicals for about 2 dozen products, which was pretty successful. Our rankings generally increased. Then things got weird. For the most part, canonicaled pages appeared to have passed link juice, since the rankings significantly improved on the other site. The clean URLs (www.domain.com/product-name/sku.cfm) disappeared from the rankings, as they are supposed to, but some were replaced by URLs with parameters that Google had indexed (apparently duplicate content), e.g. (www.domain.com/product-name/sku.cfm?clicksource?3diaftv). The parametered URLs have the correct canonical tags. In order to try to remove these from Google's index, we: 1. Had the pages fetched in GWT, assuming that Google hadn't detected the canonical tag. 2. After we discovered a few hundred of these pages indexed on both sites, built sitemaps of the offending pages and had the sitemaps fetched. If anyone has any other ideas, please share.
Intermediate & Advanced SEO | AMHC
-
Site Redesign Inconsistent Trailing Slash Issue
Intermediate & Advanced SEO | GrouchyKids
I'm looking at a site that has implemented trailing slashes inconsistently across multiple pages. For instance:
http://www.examplesite.co.uk/ (WITH)
http://www.examplesite.co.uk/product-range (WITHOUT)
http://www.examplesite.co.uk/product (WITHOUT)
http://www.examplesite.co.uk/blog/ (WITH)
http://www.examplesite.co.uk/blog/blog-article/ (WITH)
The blog was created later in WordPress, which is one of the reasons why this issue exists. Looking at the inbound links, unsurprisingly the lion's share go to the home page, but lots of other pages have links as well, particularly the product pages; not so many to the blog pages. This pattern is similar in terms of which pages rank: the home page ranks well for a variety of phrases, and the product pages also do quite well. I know that ideally the URLs should be identical to the existing site's, or if you have to, you should 301 redirect old to new. The client wants to switch the whole site over to WordPress, which will by default implement a consistent URL structure across the board, thereby changing at least some of the URLs no matter what I do. I remember a Matt Cutts video that stated that even a 301 redirect will lose a click's worth of link juice; see: https://www.youtube.com/watch?v=Filv4pP-1nw The existing site has a poor UX compared to the new proposed design, so this should help us. Has anyone got any experience with a similar issue or any advice about how best to proceed?
-
How to Avoid Duplicate Content Issues with Google?
We have 1000s of audio book titles at our Web store. Google's Panda devalued our site some time ago because, I believe, of duplicate content. We get our descriptions from the publishers, which means a good deal of our description pages are the same as the publishers' = duplicate content according to Google. Although re-writing each description of the products we offer is a daunting, almost impossible task, I am thinking of re-writing publishers' descriptions using The Best Spinner software, which allows me to replace some of the publishers' words with synonyms. I have re-written one audio book title's description, resulting in 8% unique content from the original in 520 words. I did a CopyScape check and it reported "65 duplicates." CopyScape appears to be reporting duplicates of words and phrases within sentences and paragraphs. I see very little duplicate content of full sentences or paragraphs. Does anyone know whether Google's duplicate content algorithm is the same as or similar to CopyScape's? How much of an audio book's description would I have to change to stay away from CopyScape's duplicate content algorithm? How much would I have to change to stay away from Google's?
Intermediate & Advanced SEO | lbohen
-
Concerns about duplicate content issues with australian and us version of website
My company has an ecommerce website that's been online for about 5 years. The URL is www.betterbraces.com. We're getting ready to launch an Australian version of the website, and the URL will be www.betterbraces.com.au. The Australian website will have the same look as the US website and will contain about 200 of the same products that are featured on the US website. The only major difference between the two websites is the price that is charged for the products. The Australian website will be hosted on the same server as the US website. To ensure Australians don't purchase from the US site, we are going to have a geo redirect in place that sends anyone with an AU IP address to the Australian website. I am concerned that the Australian website is going to have duplicate content issues. However, I'm not sure if the fact that the domains are so similar, coupled with the redirect, will help the search engines understand that these sites are related. I would appreciate any recommendations on how to handle this situation to ensure our rankings in the search engines aren't penalized. Thanks in advance for your help. Alison French
Intermediate & Advanced SEO | djo-283669
-
Index.php canonical/dup issues
Hello my fellow SEOs! I would LOVE some additional insight/opinions on the following... I have a client who is an industry leader: big site, ranks for many competitive phrases, blah blah... you get the picture. However, they have a big dup content/canonical issue. Most pages resolve both with and without /index.php at the end of the URL. Obviously this is a dup content issue, but more importantly the SEs sometimes serve an "index.php" version of the page and sometimes they don't; it is constantly changing which version it serves, and the rank goes up and down. Now, I've instructed them that we are going to need to write a sitewide redirect to enforce a uniform structure. Most people would say redirect to the non-index.php version, but: 1. The index.php pages consistently outperform the non-index.php versions, except the homepage. 2. The client really would prefer to have the "index.php" at the end of the URL. The homepage performs extremely well for a lot of competitive phrases. I'd like to redirect all pages to the "index.php" version except the homepage, and I'm thinking that if I redirect all pages EXCEPT the homepage to the index.php version, it could cause some unforeseen issues. I cannot use rel=canonical because they have many different versions of their pages with different country codes in the URL; for example, if I make the US version canonical, it will hurt the pages trying to rank with a fr URL, de URL, etc. (where fr/de are country codes in the URL; depending where the user is, it serves the correct version). Any advice would be GREATLY appreciated. Thanks in advance! Mike
Intermediate & Advanced SEO | MikeCoughlin