Google omitting some entries
-
Hi,
I used this tool to test some domains. The tool can be found at
http://www.virante.com/seo-tools/duplicate-content
I have no questions about the other checks, but I do have one about the similarity check.
My question is: how do I get Google not to omit some entries that are very similar to the top 1000 pages on my site?
I will appreciate your answers. Thanks.
Suleman
-
Good response, Ryan. Thank you once again.
-
I am trying to be helpful here. Your original question did not mention anything related to optimizing content. It sounds a whole lot more like you want to manipulate search engines into indexing your duplicate content. I've re-read your question a couple times and that is the only way I am able to interpret your question.
There is not a hard rule in that regard. Google will examine the percentage of duplicate content. They will understand rewording of the same content and view it as a duplication. You need to make the second page truly unique. The best suggestion I can offer is ask a second person to write a page on the same topic without allowing them to see the first page.
You can ask 5 people to write an article on the "Ford Mustang" and receive 5 completely different, unique articles. One could focus on resale value, another on the history of the Mustang, another on racing, and so on.
If you ask one person to reword the same article so it can be indexed a second time, you won't be happy with the results. Even if you do manage to get it indexed, you are one Panda update away from posting the famous "why did my rankings drop" question.
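To illustrate why rewording does not help (this is not Virante's actual algorithm, nor necessarily how Google scores pages): similarity checkers commonly approximate duplication by comparing overlapping word shingles, and a reworded article still shares most of its shingles with the original, while a genuinely different article on the same topic shares almost none. A minimal Python sketch:

```python
# Sketch of shingle-based similarity, as duplicate-content checkers
# often approximate it. All example sentences are made up.

def shingles(text, k=3):
    """Return the set of k-word shingles (overlapping word triples)."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a, b, k=3):
    """Jaccard similarity of the two texts' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = "The Ford Mustang holds its resale value better than most muscle cars."
reworded = "The Ford Mustang keeps its resale value better than most muscle cars."
unique = "Mustang racing history began with the 1964 debut at the World's Fair."

# Swapping one word still leaves most shingles identical; a truly
# different article overlaps on none of them.
print(round(similarity(original, reworded), 2))
print(round(similarity(original, unique), 2))
```

Changing a word here and there barely moves the score; only a genuinely different article drives it toward zero, which is exactly the "second writer, same topic" suggestion above.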
-
Thank you for your contribution, Ryan.
I am quite aware of writing for readers and always putting them first; besides, that's what the search engines are interested in, isn't it? I mean, readers finding quality, relevant content through their engines.
What I am interested in, after satisfying that first and primary part, is optimizing the content for search engines; isn't that what SEO is all about? Even if most sites are providing quality content, they all still have to compete for the limited top positions available, don't they?
Thank you for the feedback; I will appreciate any other contribution.
-
How do I get Google not to omit some entries very similar to the top 1000 pages on my site?
You would need to change the content enough that it is not seen as a duplicate of the original information. Think of your users and ask: if they read the first article, what would make them want to read the second one?
If you are trying to reword the article for search engines, you are in the wrong mindset. Think of your readers.
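As an aside, the "omitted entries" notice could historically be expanded by adding `filter=0` to the Google results URL, which asked Google to show the near-duplicate results it had filtered out. That only reveals the filtering rather than fixing it, and Google may change or drop support for the parameter at any time. A small sketch of building such a URL (the `site:example.com` query is just a placeholder):

```python
from urllib.parse import urlencode

def results_url(query, show_omitted=False):
    # Build a Google results URL; filter=0 historically requested
    # that omitted (near-duplicate) entries be displayed as well.
    params = {"q": query, "num": 100}
    if show_omitted:
        params["filter"] = 0
    return "https://www.google.com/search?" + urlencode(params)

# Placeholder domain for illustration:
print(results_url("site:example.com", show_omitted=True))
```

Seeing those omitted pages can help you spot which of your own URLs Google treats as near-duplicates, but the durable fix is still the one above: make the pages genuinely different.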