Are links on a press page considered "reciprocal linking"?
-
Hi,
We have a press page with a list of links to the articles that have mentioned us (most of which also have a link to our website).
Is there any SEO impact with this approach? Does Google consider these reciprocal links? And if so, would making the links on the press page 'nofollow' solve the issue?
-
It shouldn't matter at all. If you're worried the sites are low quality, you might consider nofollowing them, but press link pages are pretty common.
-
I don't think it's a problem. It would be different if you had such links in the footer or sidebar, but on just one URL it should be OK.
Anyway, nofollow them just in case.
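If you do decide to nofollow, a minimal sketch of the press-page markup (the article URLs, titles, and class name here are hypothetical, not from the actual site):

```html
<!-- Hypothetical press page: rel="nofollow" asks Google not to pass
     link equity through these links, removing any reciprocal-link concern. -->
<ul class="press-mentions">
  <li><a href="https://example.com/article-about-us" rel="nofollow">Example Magazine: "Our Startup Featured"</a></li>
  <li><a href="https://example.org/industry-roundup" rel="nofollow">Example News: "Industry Roundup"</a></li>
</ul>
```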
Related Questions
-
"Fake" market research reports killing SEO
Our robotics company is in a fast-growing, competitive market. There is an assortment of "market research" companies distributing press releases about their research reports (which are of dubious quality). These announcements end up being distributed through channels with high domain authority, and they mention many companies in the space that the purported report covers, including ours. As a result, our company name and product brand are suffering, since the volume of press announcements is swamping our ratings. What would you do? Start writing blog posts and distribute them through inexpensive news feeds? Contact the firms posting the content and let them know they are violating our trademarks by mentioning our name? Other ideas?
White Hat / Black Hat SEO | amelanson1
Canonical tags pointing to "page=all" pages on an e-commerce website
I find it alarming that my client has canonical tags pointing to "page=all" product gallery pages. Some of these gallery pages have over 100 products, and I think this could affect load time, especially on mobile. I would like some insight from the community on this. Thanks!
White Hat / Black Hat SEO | JMSCC0
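For context, the setup described in that question comes down to a choice between two canonical patterns (the URLs below are hypothetical, not the client's):

```html
<!-- Pattern the client currently uses: every paginated gallery page
     canonicalizes to the heavy "view all" version. -->
<link rel="canonical" href="https://example.com/widgets?page=all">

<!-- Alternative if the view-all page loads too slowly: each paginated
     page canonicalizes to itself instead. -->
<link rel="canonical" href="https://example.com/widgets?page=2">
```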
Strange strategy from a competitor. Is this "Google friendly"?
Hi all, We have a client in a very competitive industry (car insurance) that ranks first for almost every important and relevant keyword related to car insurance. But could they always be doing a good job? A few days ago I found this: http://logo.force.com/ The competitor's website is http://www.logo.pt/ and the competitor's name is Logo. What I found strange is that both websites are the same, except that the first is on a subdomain and has important links pointing to the original website (www.logo.pt). So my question is: is this a "Google friendly" (and fair) technique? Why does this competitor get such good results? Thanks in advance! I look forward to hearing from you guys.
White Hat / Black Hat SEO | sixam
Page not being indexed or crawled and no idea why!
Hi everyone, There are a few pages on our website that aren't being indexed on Google right now, and I'm not quite sure why. A little background: We are an IT training and management training company with locations/classrooms around the US. To improve our search rankings and overall visibility, we made some changes to the on-page content, URL structure, etc. Let's take our Washington DC location as an example. The old address was: http://www2.learningtree.com/htfu/location.aspx?id=uswd44 And the new one is: http://www2.learningtree.com/htfu/uswd44/reston/it-and-management-training Not all of the SEO changes are live yet, so just bear with me. My question is really why the first URL is still being indexed and crawled and showing fine in the search results, while the second one (which we want to show) is not. The changes have been live for around a month now, plenty of time to at least be indexed. In fact, we don't want the first URL to show anymore; we'd like the second URL type to show across the board. Also, when I search Google for site:http://www2.learningtree.com/htfu/uswd44/reston/it-and-management-training I get a message that Google can't read the page because of the robots.txt file. But we have no robots.txt file. I've been told by our web guys that the two pages are exactly the same, and that an order has been put in to 301 redirect all the old links to the new ones. Still, I'm perplexed as to why these pages are not being indexed or crawled; I even submitted the page manually in Webmaster Tools. So, why is Google still recognizing the old URLs and showing them in the index/search results? And why is Google saying "A description for this result is not available because of this site's robots.txt"? Thanks in advance! Pedram
White Hat / Black Hat SEO | CSawatzky0
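For reference, Google's "blocked by robots.txt" message means it received some response at the /robots.txt path. A deliberately permissive file (a sketch of what an allow-all robots.txt looks like, not the site's actual file) is:

```text
# Allow-all robots.txt: the empty Disallow value blocks nothing.
User-agent: *
Disallow:
```

If the server instead returns an error page or a redirect at /robots.txt, crawlers can misinterpret it, which is worth ruling out in a case like the one described.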
Duplicate content showing on local pages
I have several pages on my web design site that show duplicate content. As it's a very competitive market, I created some local pages so I rank well when someone searches locally, e.g. web design birmingham, web design tamworth, etc.: http://www.cocoonfxmedia.co.uk/web-design.html http://www.cocoonfxmedia.co.uk/web-design-tamworth.html http://www.cocoonfxmedia.co.uk/web-design-lichfield.html I am trying to work out the best way to reduce the duplicate content. What would be the best way to remove it? 1. 301 redirect (will I lose the existing page?) to my main web design page, with the geographic areas mentioned there. 2. Rewrite the wording on each page to make it unique? Any assistance is much appreciated.
White Hat / Black Hat SEO | Cocoonfxmedia0
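If option 1 were chosen, a hedged sketch of the redirects as Apache .htaccess rules (assuming an Apache server, which the question doesn't confirm):

```apache
# 301 the location pages to the main web design page. The old URLs stop
# ranking on their own, but their link equity is consolidated into the target.
Redirect 301 /web-design-tamworth.html /web-design.html
Redirect 301 /web-design-lichfield.html /web-design.html
```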
Pages Getting Deindexed
I have 16 pages on my site that were all indexed until yesterday; now only 3 are indexed. I tried resubmitting my sitemap, and the result was the same as before: 3 pages indexed and 13 deindexed. Could someone explain why this is happening and what I can do to fix it? Keep in mind my site is almost three months old, and this has happened before, but it fixed itself over time. Thanks.
White Hat / Black Hat SEO | ilyaelbert0
Internal Link Structure
Hello everyone, I'd be grateful for a little feedback, please. This is my site, the home page of which is targeting the phrase "jobs in ****" (I'm sure you can fill in the gap :)). I've made a few changes recently, which included adding the Contract Jobs in ****, Permanent Jobs in ****, Temporary Jobs in ****, and Today's Jobs in **** links to the homepage. Perhaps foolishly and impatiently, I did all of these at the same time, while also changing the site's internal link structure, specifically for all links to the homepage, which previously looked like <a href="/">Home</a> and have now been changed to <a href="/">jobs in ****</a>. That means I have 4,500 internal links with the anchor text "jobs in ****". But rather than seeing an improvement in my SERP rankings, I have gone from page 2 of Google to page 6, and falling... Apart from being impatient, what have I done wrong? Many thanks.
White Hat / Black Hat SEO | TwoPints
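The change described in that question can be illustrated in markup (a sketch with hypothetical pages, not the poster's actual site): thousands of identical exact-match anchors sitewide is the pattern that tends to look unnatural, while varied anchors spread the text out.

```html
<!-- Sitewide navigation: generic anchor on every page. -->
<a href="/">Home</a>

<!-- Keyword-rich anchor used sparingly, in contextual body copy
     rather than repeated 4,500 times in the template. -->
<p>Browse today's <a href="/">jobs in ****</a> listings.</p>
```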
Influence of users' comments on a page (on-page SEO)
Do you think that when Google crawls your page, it "monitors" comment updates and uses them as a ranking factor? If Google is looking for social signals, comment updates might be a social signal as well (OK, a lot easier to manipulate, but still social). Thanks!
White Hat / Black Hat SEO | gt30