Would there be any benefit to creating multiple pages of the same content to target different titles?
-
Obviously, the duplicated pages would be canonicalized, but would there be a way of anchoring where the user lands on the page based on the search term they entered?
For example:
If you have a site that sells cars, you could have duplicate pages targeting titles such as "(brand) cars for sale", "finance options", "best car for a family", "how far will the (brand) car go on a full tank", and so on. Then you'd make each information block an H2, using the same H2s as the duplicated page titles.
Then it gets complicated: if someone searches "best car for a family" and clicks the duplicated page's title, how would you anchor that user to the section of the page with that information?
Could there be a benefit to doing this or would it just not work?
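For context on the anchoring part of the question: deep-linking a visitor to a section of a page is normally done with fragment identifiers, i.e. giving each H2 an `id` and linking to it with `#that-id`. A minimal sketch (the ids, headings, and URL here are made up for illustration):

```html
<!-- Each information block gets an id so it can be linked to directly -->
<h2 id="best-family-car">Best car for a family</h2>
<p>…</p>

<h2 id="full-tank-range">How far will the car go on a full tank?</h2>
<p>…</p>

<!-- A link can then land the visitor on a specific section via the fragment: -->
<!-- https://example.com/cars#best-family-car -->
```

Note this only controls where a user lands when a link includes the fragment; you can't force Google to append a fragment to an ordinary title click. (Google does sometimes scroll searchers to a highlighted passage on its own, via text fragments like `#:~:text=…`, but that's Google's choice, not something you control from the page.)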
-
I've tried this for the sake of science, and it didn't work.
-
Thanks, Gaston.
I thought this might be the case.
I was just interested to see if anyone had a way around this, as targeting the same page with different titles could be interesting.
-
Hi there,
Hope you're well. No, this tactic won't help you. Worse, you'll probably be penalized and/or those pages won't rank properly.
Also, when canonicalizing, only one page will be shown in Google, not the canonicalized duplicates. Hope it helps.
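For reference, this is what canonicalization looks like in practice: every duplicated page points at the one preferred URL, and that preferred URL is the only one eligible to appear in results. A sketch with a placeholder URL:

```html
<!-- In the <head> of each duplicated page, pointing at the single page
     you want indexed; the duplicates themselves won't be shown. -->
<link rel="canonical" href="https://example.com/cars" />
```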
Best luck,
GR