How to associate content on one page with another page
-
Hi all,
I would like associate content on "Page A" with "Page B". The content is not the same, but we want to tell Google it should be associated. Is there an easy way to do this?
-
Yeah, I'm afraid Chris is right. There's really no way to tell Google to index both pages but then not give them control over which one ranks. Google is naturally going to prefer the full content page, because they want to get people to the best "answer", in a sense.
Truthfully, I think it's a better search user experience in most cases. Visitors using your internal search can travel from the snippet to the post, but Google users may get frustrated at landing on your snippet page instead of going straight to the resource. If you force that extra step on search users, you may actually increase your bounce rate and harm your overall performance.
-
The problem you may have is duplicate content; it's kinda trying to have your cake and eat it, unfortunately. You could try to manually submit the pages for indexing, but it is a bit of a predicament you're in. The other option is to wait until both pages get indexed naturally, or submit a sitemap via Webmaster Tools. As mentioned, I would still have a slight concern about duplicate content.
-
I thought rel=canonical might be the answer, but from my understanding, if we put this at the top of Page B, Google would ignore the content on Page B and just associate any "link juice" etc. on Page B with Page A.
And we really want to make sure that Google indexes the content on Page B. Any ideas?
Am I misunderstanding rel=canonical?
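For reference, the tag I was thinking of putting in the head of Page B looks something like this (the URL here is just a placeholder, not our real one):

```html
<!-- In the <head> of Page B: tells Google that Page A is the preferred (canonical) URL -->
<link rel="canonical" href="https://www.example.com/page-a" />
```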
-
Sounds like a job for rel=canonical.
You can also submit a sitemap to help Google index both pages.
I would be careful not to make it look like cloaking, as that can be bad. You could have the snippet page with a "read more" link and handle the "read more" target page in robots.txt.
Couple of ideas for you there. Hope they help.
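To sketch the sitemap idea: a minimal XML sitemap listing both pages (URLs are placeholders) might look like this, and you would submit it through Webmaster Tools:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- The snippet / "read more" page -->
  <url>
    <loc>https://www.example.com/page-a</loc>
  </url>
  <!-- The full content page -->
  <url>
    <loc>https://www.example.com/page-b</loc>
  </url>
</urlset>
```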
-
Well it's not 2 sites, it's two pages. So it's a read more page and a full content page. We want Google to index the full content page, so that it's there, but we want our users to go to the page with the "read more" first, before they see that full content page.
So we would like Google to associate the full content page with the read more page, so that the read more page comes up before the full content page.
Thanks!
-
Can you go a bit more in depth? Is there a reason why you can't just link the two pages?