Indexing techniques
-
Hi,
I just want some confirmation about my indexing technique: whether it's good or whether it can be improved. The technique is totally white hat and can be done by one person. Any suggestions or improvements are welcome.
- I create the backlinks first, of course.
- I make a list in a public Google Doc.
- Each doc contains only ten links.
- After that, I Digg it and add 5-6 more bookmarks.
- I tweet the Digg submission and each doc (my 2 Twitter accounts have page authority 98).
- I like them on Facebook.
- I ping them through ping services.
- That's it. It works OK for the moment.
Is there anything I can do to improve my technique?
Thanks a lot
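The batching step above (ten links per public doc) and the ping step can both be scripted. Below is a minimal sketch under some assumptions: the helper names and example URLs are hypothetical, and the Ping-O-Matic XML-RPC endpoint is one common `weblogUpdates.ping` service, not necessarily the ones being used here.

```python
import xmlrpc.client

def batch_links(urls, per_doc=10):
    """Split a list of backlink URLs into batches of `per_doc` links,
    one batch per public doc."""
    return [urls[i:i + per_doc] for i in range(0, len(urls), per_doc)]

def doc_body(batch):
    """Plain-text body for one public doc: one link per line."""
    return "\n".join(batch)

def ping(doc_url, title, service="http://rpc.pingomatic.com/"):
    """Notify a ping service (weblogUpdates.ping) that a page has updated.
    Network call -- shown for illustration only; the service URL is an
    assumption and may differ from the ping services actually used."""
    server = xmlrpc.client.ServerProxy(service)
    return server.weblogUpdates.ping(title, doc_url)

if __name__ == "__main__":
    # Hypothetical backlink list: 23 URLs -> three docs (10, 10, 3 links).
    backlinks = ["https://example.com/page-%d" % n for n in range(1, 24)]
    for i, batch in enumerate(batch_links(backlinks), start=1):
        print("Doc %d: %d links" % (i, len(batch)))
```

The ten-links-per-doc split keeps each doc small enough to be crawled quickly; the ping is only worth sending once a doc is public.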
-
No, it's not gaming; it's adult. But I'm also thinking about developing a gaming site, turning mine into a gaming site, because in Cy there are no SEO jobs. People there are more into gambling, and I don't think I would do well online... Also, I make more money from affiliate marketing than I would working for somebody. Maybe I wasn't so lucky, I guess... but it's OK, I'm still happy :)
-
Based on your profile, I'm guessing this is a gaming-related site?
-
My goal is to get the old pages, which contain my links, crawled fast. It's not about my own pages.
-
Many of them have authority 10, 20, 30, 40; some others are zero. All are indexed pages, because I am taking the links from a competitor. Yes, some are low-quality links, but he is ranking number 1 out of 2,500,000 exact matches. I just make this effort to speed up the indexing, because many of them don't get indexed fast. I mean, I saw some of them only start to show up in Webmaster Tools after a month. With this process, all of them get indexed in one day at most.

As for the quality links you're suggesting I get: that's almost impossible due to the nature of the niche. Nobody wants to give them, as this specific keyword is extremely profitable and has millions of searches. I mean, the hardest part is to get the already good ones, and to build authority for the new ones I create... OHHHH. Also, there are just 2 of us working here. Of the 1,000 links I've visited so far, only 60 were possible to get; another 9,000 links are still left to check. If I get up to 600 of his links, that will be good, I guess. My site is already ranking for his keyword, but at around position 50 (on-page optimization only). It's old, PR 2, with 150 likes and some tweets, all real. The new links were built in the last 2 days, so I don't know where the site will go. Another bad thing is that there are around 45 exact-match domains under him with the same keyword... mine isn't even in the URL.
-
I believe you are referring to getting backlinks indexed. The only reason you would need to go to all that effort is if you were building low-quality links on deep pages or pages with thin content that Google would not value in their index (e.g. forum profile links, blog comments). I'm sure you are doing more than enough to get your links indexed, but they will quickly become deindexed if Google no longer values the page content. If you are going to all this effort to index a batch of low-quality links, then why not put that same effort into building links on pages with more trust & better-quality content that Google will want in their index?
-
If your goal is to get your web pages indexed, then why not create a sitemap and submit it in GWT? I don't understand why you would go through all that trouble to get your web pages indexed.
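Generating a sitemap for submission is a few lines of code. A minimal sketch, assuming a hypothetical page list; in practice the URLs would come from your CMS or a crawl, and the output goes in a `sitemap.xml` file submitted via GWT:

```python
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Return a minimal XML sitemap string (sitemaps.org protocol)
    for the given page URLs."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url in urls:
        # URLs must be XML-escaped (e.g. & -> &amp;).
        lines.append("  <url><loc>%s</loc></url>" % escape(url))
    lines.append("</urlset>")
    return "\n".join(lines)

if __name__ == "__main__":
    print(build_sitemap(["https://example.com/", "https://example.com/about"]))
```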
Related Questions
-
Trying to get Google to stop indexing an old site!
Howdy, I have a small dilemma. We built a new site for a client, but the old site is still ranking/indexed and we can't seem to get rid of it. We set up a 301 from the old site to the new one, as we have done many times before, but even though the old site is no longer live and the hosting package has been cancelled, the old site is still indexed. (The new site is at a completely different host.) We never had access to the old site, so we weren't able to request URL removal through GSC. Any guidance on how to get rid of the old site would be very much appreciated. BTW, it's been about 60 days since we took these steps. Thanks, Kirk
Intermediate & Advanced SEO | kbates
-
No-Indexing on Ecommerce site
Hi, Our site has a lot of similar/lower-quality product pages which aren't a high priority, so these probably won't get looked at in detail to improve performance, as we have over 200,000 products. Some of them do generate a small amount of revenue, but an article I read suggested no-indexing pages which are of little value, to improve site performance & overall structure. I wanted to find out if anyone has done this and what results they saw. Will this actually improve rankings of our focus areas? It makes me a bit nervous to just block pages, so any advice is appreciated 🙂
Intermediate & Advanced SEO | BeckyKey
-
Should I use noindex or robots to remove pages from the Google index?
I have a Magento site and just realized we have about 800 review pages indexed. The /review directory is disallowed in robots.txt, but the pages are still indexed. From my understanding, robots.txt means Google will not crawl the pages, BUT the pages can still be indexed if they are linked from somewhere else. I can add the noindex tag to the review pages, but they won't be crawled, so the tag won't be seen. https://www.seroundtable.com/google-do-not-use-noindex-in-robots-txt-20873.html Should I remove the robots.txt disallow and add the noindex? Or just add the noindex to what I already have?
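For reference, a sketch of the combination usually suggested for this situation: remove the robots.txt Disallow for /review so Googlebot can crawl those pages again, and serve a noindex on each review page so it drops out of the index over time. The /review path is taken from the question; exactly where the tag goes depends on your Magento templates.

```html
<!-- In the <head> of each /review page: ask engines to drop the page
     from the index but still follow its links. Google can only see
     this tag once the "Disallow: /review" line has been removed from
     robots.txt, because a blocked page is never fetched. -->
<meta name="robots" content="noindex, follow">
```

The same directive can be sent as an `X-Robots-Tag: noindex` HTTP response header if editing the templates is awkward.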
Intermediate & Advanced SEO | Tylerj
-
[wtf] Mysterious Homepage De-Indexing
Our homepage, as well as several similar landing pages, has vanished from the index. Could you guys review the below pages to make sure I'm not missing something really obvious?!
URLs:
- http://www.grammarly.com
- http://www.grammarly.com/plagiarism-checker
It's been four days, so it's not just a temporary fluctuation. The pages don't have a "noindex" tag on them and aren't being excluded in our robots.txt. There's no notification about a penalty in WMT.
Clues:
- WMT is returning an "HTTP 200 OK" for Fetch, but is showing a redirect to grammarly.com/1 (an alternate version of the homepage that contains a rel=canonical back to the homepage) for Fetch+Render. Could this be causing a circular redirect?
- Some pages on our domain are ranking fine, e.g. https://www.google.com/search?q=grammarly+answers
- A month ago, we redesigned the pages in question. The new versions are pretty script-heavy, as you can see.
- We don't have a sitemap set up yet.
Any ideas? Thanks in advance, friends!
Intermediate & Advanced SEO | ipancake
-
Why would one of our section pages NOT be indexed by Google?
One of our higher-traffic section pages is not being indexed by Google. The products that reside on this section page ARE indexed by Google and are on page 1. So why wouldn't the section page even be listed and indexed? The meta title is accurate and the meta description is good. I haven't received any notices in Webmaster Tools. Is there a way to check whether OTHER pages might also not be indexed? What should a small ecom site do to get it listed? SOS in Modesto. Ron
Intermediate & Advanced SEO | yatesandcojewelers
-
Indexing a several millions pages new website
Hello everyone, I am currently working for a huge classified website that will be launched in France in September 2013. The website will have up to 10 million pages. I know the indexing of a website of such size should be done step by step, and not all at once, to avoid a long sandbox risk and to keep more control over it. Do you guys have any recommendations or good practices for such a task? Maybe some personal experience you might have had? The website will cover about 300 jobs: in all regions (= 300 * 22 pages), in all departments (= 300 * 101 pages), in all cities (= 300 * 37,000 pages). Do you think it would be wiser to index a couple of jobs at a time (for instance 10 jobs every week) or to index by levels of pages (for example, 1st step with jobs by region, 2nd step with jobs by department, etc.)? More generally speaking, how would you go about avoiding penalties from Google while indexing the whole site as fast as possible? One more detail: we'll rely on a (big?) press follow-up and on a link-building effort that has yet to be determined. Thanks for your help! Best Regards, Raphael
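Whichever release order is chosen, a site of that size needs a sitemap index anyway: the sitemaps.org protocol caps each sitemap file at 50,000 URLs, and the per-file split gives a natural unit for releasing one section (e.g. one job, or one page level) at a time. A rough sketch; the file-naming pattern and domain are assumptions:

```python
SITEMAP_LIMIT = 50_000  # max URLs per sitemap file under the sitemaps.org protocol

def split_into_sitemaps(urls, name_pattern="https://example.fr/sitemap-%d.xml"):
    """Chunk URLs into sitemap files of at most SITEMAP_LIMIT entries.
    Returns ({sitemap_url: its urls}, index_xml) where index_xml is the
    sitemap index file listing every chunk."""
    files = {}
    for i in range(0, len(urls), SITEMAP_LIMIT):
        files[name_pattern % (i // SITEMAP_LIMIT + 1)] = urls[i:i + SITEMAP_LIMIT]
    index = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for sitemap_url in files:
        index.append("  <sitemap><loc>%s</loc></sitemap>" % sitemap_url)
    index.append("</sitemapindex>")
    return files, "\n".join(index)
```

Releasing step by step then just means submitting the index with a subset of the chunks first and adding the remaining sitemap files over the following weeks.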
Intermediate & Advanced SEO | Pureshore
-
Getting a Facebox Item De-Indexed
Hello all, Select pages on a website I manage utilize Facebox lightbox elements for additional information. These Faceboxes have been indexed as their own pages by Google. Unfortunately, when they were created, no call to action or navigation elements were included, so I would prefer that Google not index them, since it is a pretty horrible user experience to land on one of these Faceboxes directly. I put no-index tags on each about three weeks ago, but they are still being indexed at present. Does anyone have tips or tricks for handling this scenario, or any ideas that I am not thinking of beyond just no-indexing them? Thanks in advance!
Intermediate & Advanced SEO | ClayPotCreative
-
Why are so many pages indexed?
We recently launched a new website and it doesn't consist of that many pages. When you do a "site:" search on Google, it shows 1,950 results. Obviously we don't want this to be happening, and I have a feeling it's affecting our rankings. Is this just a straight-up robots.txt problem? We addressed that a while ago and the number of results isn't going down. It's very possible that we still have it implemented incorrectly. What are we doing wrong, and how do we start getting pages "un-indexed"?
Intermediate & Advanced SEO | MichaelWeisbaum