Does Google detect all updated pages with new links as paid links?
Example: a PR 4 page is updated a year later with new links. Does Google discredit these links as being fishy?
-
If you're worried about this, then ask the site to do a larger content update on their page rather than just changing a link. Get them to add an editorial note that says when the update happened and why.
A lot of "keyword sponsorship" services only change phrases into links and don't actually update the content. Plus, if it comes down to a manual review, you've almost got proof that it's not just some crappy links you've paid for.
-
There are two pieces of advice I can share:
1. As an SEO you should be intimately familiar with Google's Guidelines: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35769
2. As long as you are following the guidelines in good faith, you should have nothing to worry about at all.
If you sincerely practice white hat SEO these are not things you need to worry about. On the other hand if you step into grey/black hat at times, then you have a good reason to be concerned.
-
Yeah, definitely. But what may seem like legitimate, white-hat webmaster-to-webmaster outreach can clearly also resemble paid links. I'm just hoping not to get penalized for something that looks like something it isn't, which seems harder and harder with every algo update.
-
Does Google detect all updated pages with new links as paid links?
No.
There are plenty of legitimate reasons to update an older page. For example, an article may have been written in 2010 on a given topic, such as "Best Vacation Hideaways in Hawaii". In 2012, another author may write a similar article. The original article's author may then add a link to the newer article, stating "for more ideas, see [insert new article link]".
The above is a legitimate example of why an older article may receive a new link.
With the above noted, almost any authentic technique can be transformed into a black-hat tactic. Google has numerous methods for detecting pages that try to manipulate links.
Related Questions
-
How long will old pages stay in Google's cache? We have a new site that is two months old, but we are seeing old pages even though we used 301 redirects.
Two months ago we launched a new website (same domain) and implemented 301 redirects for all of the pages. Two months later, we are still seeing old pages in Google's cache. So how long should I tell the client it will take for them all to be removed from search?
-
Different snippet in Google for same page
Hello, I have a question regarding the snippet in a specific case: when I search for the homepage by the business name, I see the correct snippet for the homepage (with the meta description that was entered). If I search for it via site:www., it still shows the default meta description. Has anybody had experience with this? Is there a way to change the snippet shown for site:www. queries? Does it influence SEO? Thank you!
-
Best way to link to 1,000 city landing pages from the index page so that Google follows/crawls these links (without building country pages)?
Currently we have direct links to the top 100 country and city landing pages on our index page of the root domain.
I would like to add on the index page, for each country, a "more cities" link which then dynamically loads (without reloading the page and without redirecting to another page) a list of links to all cities in that country.
I do not want to dilute "link juice" from my top 100 country and city landing pages on the index page.
I would still like Google to be able to crawl and follow these links to the cities that I load dynamically later. In this particular case, a typical site hierarchy of country pages with links to all cities is not an option. Any recommendations on how best to implement this?
-
Pages with excessive number of links
Hi all, I work for a retailer and I've crawled our website with RankTracker for optimization suggestions. The main suggestion is "Pages with excessive number of links: 4178". The page with the largest number of links has 634 links (627 internal, 7 external); the lowest has 382 links (375 internal, 7 external). However, when I view the source on any one of the example pages, it becomes obvious that the site's main navigation header contains 358 links, so every new page starts with 358 links before any content. Our rivals and much larger sites like argos.co.uk appear to have just as many links in their main navigation menu. So my questions are:
1. Will these excessive links really be causing us a problem, or is it just 'good practice' to have fewer links?
2. Can I use 'nofollow' to stop Google etc. from counting the 358 main navigation links?
3. Is having 4000+ pages of your website all dumbly pointing to other pages a help or hindrance?
4. Can we 'minify' this code so it's cached on first load and therefore loads faster? Thank you.
-
Google Indexing Feedburner Links???
I just noticed that for lots of the articles on my website, there are two results in Google's index. For instance: http://www.thewebhostinghero.com/articles/tools-for-creating-wordpress-plugins.html and http://www.thewebhostinghero.com/articles/tools-for-creating-wordpress-plugins.html?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+thewebhostinghero+(TheWebHostingHero.com). Now my Feedburner feed is set to "noindex" and it's always been that way. The canonical tag on the webpage is set to: <link rel='canonical' href='http://www.thewebhostinghero.com/articles/tools-for-creating-wordpress-plugins.html' />. The robots tag is set to: <meta name="robots" content="index,follow,noodp" />. I found out that there are scraper sites that are linking to my content using the Feedburner link. So should the robots tag be set to "noindex" when the requested URL is different from the canonical URL? If so, is there an easy way to do this in WordPress?
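The rule being proposed here (emit "noindex" whenever the requested URL differs from the canonical URL) can be sketched as a small decision function. This is an illustrative sketch in Python, not WordPress code; the function name and the returned meta values simply mirror the tags quoted above.

```python
from urllib.parse import urlsplit, urlunsplit


def robots_meta(requested_url: str, canonical_url: str) -> str:
    """Decide the robots meta content for a request.

    If the requested URL (including its query string, e.g. the
    feedburner utm_* parameters) does not match the canonical URL,
    return "noindex,follow"; otherwise keep the page indexable.
    """
    def normalize(url: str) -> str:
        parts = urlsplit(url)
        # Scheme and host are case-insensitive; path and query are not.
        return urlunsplit((parts.scheme.lower(), parts.netloc.lower(),
                           parts.path, parts.query, ""))

    if normalize(requested_url) != normalize(canonical_url):
        return "noindex,follow"
    return "index,follow,noodp"
```

The tracking-parameter variant of the article URL would get "noindex,follow", while the clean URL keeps the original robots value.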
-
How to detect bad neighborhood links?
I have the feeling that I am suffering from negative SEO. Is there a way to get a list of the links that I should submit to Google's disavow links tool?
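For context, the disavow tool takes a plain-text file with one entry per line: a full URL to disavow a single link, or "domain:example.com" to disavow every link from a host, with "#" lines as comments. A minimal sketch of assembling such a file from an audit list (the function name and inputs are illustrative, not part of any tool):

```python
def build_disavow_file(bad_urls, bad_domains):
    """Assemble the text of a Google disavow file.

    bad_urls    -- individual link URLs to disavow one by one
    bad_domains -- hosts whose links should all be disavowed
    """
    lines = ["# Disavow file generated from a manual link audit"]
    # Whole-host entries use the domain: prefix.
    lines += [f"domain:{d}" for d in sorted(set(bad_domains))]
    # Single-link entries are plain URLs.
    lines += sorted(set(bad_urls))
    return "\n".join(lines) + "\n"
```

The resulting text would be saved as a .txt file and uploaded through the disavow tool in Search Console; the hard part remains deciding which links belong in the lists.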
Intermediate & Advanced SEO | | Valarlf0 -
New links not showing in Site Explorer?
I have built links to my site this past month that I know are live and in place, some dofollow and some nofollow. Are the nofollow links just not going to show up in my Site Explorer data? And the others: why would they not be showing up yet? SEOmoz updated their link data Aug 1st, and my site has been crawled since then, but this new link-building work has not shown up, none of it. It's like I did not do any work. How long could it take for these links to show up and affect my site's trust, etc.? Also, is there anything I could be doing to speed up the process of having the new links found?
-
Google, Links, and JavaScript
So today I was taking a look at the http://www.seomoz.org/top500 page and saw that the AddThis page is currently at position 19. I think the main reason for that is that their plugin creates, through JavaScript, linkbacks to the page where their share buttons reside. So any page with AddThis installed would easily have 4-5 linkbacks to their site, creating the huge number of linkbacks they have. OK, that pretty much shows that Google doesn't care whether the link is created in the HTML (on the backend) or through JavaScript (frontend). But here's the catch. Say someone creates a free plugin for WordPress/Drupal or any other big CMS platform with a feature that links back to the plugin creator's page (that's pretty common, I know), but instead of inserting the link in the plugin source code, they host it somewhere else, where it is then loaded with a JavaScript call (exactly how AddThis works). This would allow the owner of the plugin to change the displayed link at any time. The stated reason for that might be, I don't know, a URL change for their blog or business or something. However, it could just as easily be abused to link to whatever the owner of the plugin wants. What are your thoughts on this? I think it could be classified as white or black hat depending on what the owners do, but would Google see it the same way?