Indexing techniques
-
Hi,
I'd just like confirmation on my indexing technique: whether it's good as-is or can be improved. The technique is totally white-hat and can be done by one person. Any suggestions or improvements are welcome.
- First, of course, I create the backlinks.
- I make a list of them in a public Google Doc.
- Each doc contains only ten links.
- After that, I Digg it and add 5-6 more social bookmarks.
- I tweet the Digg and each doc. (My two Twitter accounts have a page authority of 98.)
- I like them on Facebook.
- I ping them through ping services.
- That's it. It works OK for the moment.
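The doc-batching step above can be sketched in a few lines. A minimal sketch, assuming a plain list of backlink URLs (the example.com URLs are placeholders); the doc creation, Digging, and tweeting are still manual:

```python
# Split a list of backlink URLs into batches of ten, one batch per public doc.
# The example.com URLs are hypothetical placeholders.

def chunk_links(links, size=10):
    """Yield successive batches of `size` links, one batch per doc."""
    for i in range(0, len(links), size):
        yield links[i:i + size]

backlinks = [f"https://example.com/page-{n}" for n in range(1, 24)]
docs = list(chunk_links(backlinks))

print(len(docs))     # 3 docs: 10 + 10 + 3 links
print(len(docs[0]))  # 10
```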
Is there anything I can do to improve my technique?
Thanks a lot
-
No, it's not gaming; it's adult. But I'm also thinking about developing a gaming site, turning mine into a gaming site, because in Cy there are no SEO jobs. People there are more into gambling, and online I don't think I would do well... Also, I make more money from affiliate marketing than I would working for somebody... Maybe I wasn't so lucky, I guess... But it's OK. I'm still happy :)
-
Based on your profile, I'm guessing this is a gaming-related site?
-
My goal is to get the old pages that contain my links crawled fast. It's not about my own pages.
-
Many of them are authority 10, 20, 30, 40; some others are zero. All are indexed pages, because I am taking the links from a competitor. Yes, some are low quality links, but he is ranking number 1 out of about 2,500,000 exact matches. I just make this effort to speed up the indexing, because many of them don't get indexed fast. I mean, I saw some of them only start to show up in Webmaster Tools after a month. After this process, all of them get indexed in one day at most.

As for the quality links you suggest getting, that is almost impossible due to the nature of the niche. Nobody wants to give them, as this specific keyword is extremely profitable and has millions of searches. I mean, the hardest part is to get the already good ones and then build authority for the new ones I create... OHHHH... Also, there are just 2 of us working here. Of the 1,000 links I have visited so far, only 60 were possible to get. Another 9,000 links remain to check. If I get up to 600 of his links, that will be good, I guess. My site already ranks for his keyword, but at around position 50 (just on-page optimization), and it's old, PR 2, with 150 likes and some tweets, all real. The new links were built in the last 2 days, so I don't know where the site will go. Another bad thing is that there are around 45 exact-match domains under him with the same keyword... Mine isn't even in the URL.
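For what it's worth, the target of 600 follows directly from the numbers in the post. A quick back-of-the-envelope projection, using only the figures given above:

```python
# Back-of-the-envelope projection using the figures from the post:
# 60 usable links found in the first 1,000 prospects, 9,000 left to check.

checked, usable = 1000, 60
remaining = 9000

success_rate = usable / checked                      # 0.06, i.e. 6%
projected_total = usable + round(remaining * success_rate)

print(f"{success_rate:.0%}")  # 6%
print(projected_total)        # 600
```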
-
I believe you are referring to getting backlinks indexed. The only reason you would need to go to all that effort is if you were building low quality links on deep pages or pages with thin content that Google would not value in their index (e.g. forum profile links, blog comments). I'm sure you are doing more than enough to get your links indexed, but they will quickly be deindexed if Google no longer values the page content. If you are going to all this effort to index a batch of low quality links, then why not put that same effort into building links on pages with more trust and better quality content that Google will want in their index?
-
If your goal is to get your web pages indexed, then why not create a sitemap and submit it in GWT? I don't understand why you would go through all that trouble to get your web pages indexed.
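To illustrate the sitemap suggestion: a minimal sketch that builds a sitemap.xml with Python's standard library. The URLs are hypothetical placeholders, and the resulting file would still need to be uploaded and submitted in GWT by hand:

```python
# Minimal sitemap.xml builder using only the standard library.
# The URLs are hypothetical; swap in the site's real pages.
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    urlset = ET.Element("urlset",
                        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap([
    "https://www.example.com/",
    "https://www.example.com/products/",
])
print(xml)
```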
Related Questions
-
I still see the old page in index
Hello, I have done a redirect, and after 3 weeks I still see my old page in Google's index. My new page is there as well. Is it normal that the old page hasn't been dropped from the index yet? Thank you,
Intermediate & Advanced SEO | seoanalytics
-
Difficulty with Indexing Pages - Desperate for Help!
I have a website with product pages that use the same URL but load different data based on what's passed to them with GET. I am using a Wordpress website, but all of the page information is retrieved from a database using PHP and displayed with PHP. Somehow these pages are not being indexed by Google. I have done the following: 1. Created a sitemap pointing to each page. 2. Defined URL parameters in Search Console for these types of pages. 3. Created a product schema using schema.org and tested it without errors. I have requested re-indexing repeatedly, and these pages and the images on them are still not being indexed! Does anybody have any suggestions?
Intermediate & Advanced SEO | jacleaves
-
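For parameterized product pages like the ones described above, one common piece of the puzzle is making sure each product URL emits a stable, self-referencing canonical tag. A minimal sketch, assuming a hypothetical `product_id` parameter:

```python
# Emit a self-referencing canonical tag for a GET-parameter product page,
# keeping only the parameter that selects the product and dropping the rest.
# The base URL and `product_id` parameter name are hypothetical.
from urllib.parse import urlencode

def canonical_tag(base, product_id):
    query = urlencode({"product_id": product_id})
    return f'<link rel="canonical" href="{base}?{query}">'

tag = canonical_tag("https://www.example.com/product", 123)
print(tag)  # <link rel="canonical" href="https://www.example.com/product?product_id=123">
```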
Why Aren't My Images Being Indexed?
Hi, One of my clients submitted an image sitemap with 465 images. It was submitted on July 20, 2017 to Google Search Console. None of the submitted images have been indexed, and I'm wondering why. Here's the image sitemap: http://www.tagible.com/images_sitemap.xml We do use a CDN for the images, and the images are hosted on a subdomain of the client's site, e.g. https://photos.tagible.com/images/Les_Invalides_Court_Of_Honor.jpg Thanks in advance! Cheers, Julian
Intermediate & Advanced SEO | SEOdub
-
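For reference, an image sitemap entry needs Google's image extension namespace alongside the normal sitemap namespace. A minimal sketch using Python's standard library; the page and image URLs are hypothetical stand-ins for the photos.* subdomain setup described above:

```python
# Build one image-sitemap entry with Google's image extension namespace.
# The page and image URLs are hypothetical stand-ins.
from xml.etree import ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
IMAGE_NS = "http://www.google.com/schemas/sitemap-image/1.1"

ET.register_namespace("", SITEMAP_NS)
ET.register_namespace("image", IMAGE_NS)

urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
url = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
ET.SubElement(url, f"{{{SITEMAP_NS}}}loc").text = "https://www.example.com/paris/"
image = ET.SubElement(url, f"{{{IMAGE_NS}}}image")
ET.SubElement(image, f"{{{IMAGE_NS}}}loc").text = (
    "https://photos.example.com/images/Les_Invalides_Court_Of_Honor.jpg"
)

xml = ET.tostring(urlset, encoding="unicode")
print(xml)
```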
How long to re-index a page after being blocked
Morning all! I am doing some research at the moment and am trying to find out, just roughly, how long you have ever had to wait to have a page re-indexed by Google. For this purpose, say you had blocked a page via meta noindex or disallowed access via robots.txt, and then opened it back up. No right or wrong answers, just after a few numbers 🙂 Cheers, -Andy
Intermediate & Advanced SEO | Andy.Drinkwater
-
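Before timing re-indexing, it's worth double-checking that the block really has been lifted. Python's standard library can verify robots.txt rules offline; the rules and URLs below are hypothetical:

```python
# Offline check of what a robots.txt rule blocks, using the stdlib parser.
# The rules and URLs are hypothetical.
from urllib import robotparser

rules = """\
User-agent: *
Disallow: /private/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("Googlebot", "https://www.example.com/private/page.html"))  # False
print(rp.can_fetch("Googlebot", "https://www.example.com/public/page.html"))   # True
```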
How can I make a list of all URLs indexed by Google?
I started working for this eCommerce site 2 months ago, and my SEO site audit revealed a massive spider trap. The site should have around 3,500 pages, but Google has over 30K pages in its index. I'm trying to find an effective way of making a list of all URLs indexed by Google. Anyone? (I basically want to build a sitemap with all the indexed spider-trap URLs, then set up 301s on those, then ping Google with the "defective" sitemap so they can see what the site really looks like and remove those URLs, shrinking the site back to around 3,500 pages.)
Intermediate & Advanced SEO | Bryggselv.no
-
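Once you have a raw list of indexed URLs, collapsing them to their canonical paths shows how many real pages hide behind the 30K. A sketch, assuming the spider trap is runaway query parameters (the URL patterns are hypothetical):

```python
# Collapse parameterised spider-trap URLs down to their canonical paths.
# The URL patterns are hypothetical; adjust for the real trap.
from urllib.parse import urlsplit

indexed = [
    "https://www.example.com/lamp?sort=price",
    "https://www.example.com/lamp?sort=price&page=2",
    "https://www.example.com/lamp",
    "https://www.example.com/chair?ref=home",
]

canonical = {urlsplit(u)._replace(query="").geturl() for u in indexed}
print(sorted(canonical))
# ['https://www.example.com/chair', 'https://www.example.com/lamp']
```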
Best Format to Index a Large Data Set
Hello Moz, I've been working on a piece of content that has 2 large data sets I have organized into a table. I would like it indexed, and I want to know the best way to code the data for search engines while still providing a good visual experience for users. I actually created the piece 3 times and am deciding which format to go with, and I would love your professional opinions. 1. HTML5 - all the data is coded using tags and contained on the page. This is the most straightforward method, and I know this will get indexed; however, it is also the ugliest-looking table and the least functional. 2. Java - I used Google Charts and loaded all the data into a
Intermediate & Advanced SEO | jwalker88
-
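On option 1 above: the safest pattern is to render the data as a plain HTML table so the values sit in the page source for crawlers, then layer a JS chart on top for users. A minimal sketch with placeholder data:

```python
# Render a data set as a plain HTML table so the values are present in the
# page source for crawlers. The data is a hypothetical placeholder.

def html_table(headers, rows):
    head = "".join(f"<th>{h}</th>" for h in headers)
    body = "".join(
        "<tr>" + "".join(f"<td>{cell}</td>" for cell in row) + "</tr>"
        for row in rows
    )
    return f"<table><thead><tr>{head}</tr></thead><tbody>{body}</tbody></table>"

table = html_table(["Year", "Value"], [[2012, 41], [2013, 57]])
print(table)
```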
XML Sitemap Indexation Rate Decrease
On September 28th, 2013, I saw the indexation rate decrease on the XML sitemap I've submitted through GWT. I've since scraped my sitemap and removed all 404 and 400 errors (which made up only ~5% of the entire sitemap). Any idea why Google randomly started indexing less of my XML sitemap on that date? I updated my sitemap 2 weeks before that date and had an indexation rate of ~85%; now I'm below 35%. Thoughts, ideas, experiences? Thanks!
Intermediate & Advanced SEO | RobbieWilliams
-
Indexing/Sitemap - I must be wrong
Hi All, I would guess that a great number of us new to SEO (or not) share some simple beliefs about Google indexing and sitemaps, and as such get confused by what Webmaster Tools shows us. It would be great if someone with experience/knowledge could clear this up once and for all 🙂

Common beliefs:
- Google will crawl your site from the top down, following each link and recursively repeating the process until it bottoms out/becomes cyclic.
- A sitemap can be provided that outlines the definitive structure of the site, and is especially useful for links that may not be easily discovered via crawling.
- In Google's Webmaster Tools, in the sitemap section, the number of pages indexed shows the number of pages in your sitemap that Google considers worthwhile indexing.
- If you place a rel="canonical" tag on every page pointing to the definitive version, you will avoid duplicate content and aid Google in its indexing endeavour.

These preconceptions seem fair, but must be flawed. Our site has 1,417 pages as listed in our sitemap. Google's tools tell us there are no issues with this sitemap, but a mere 44 are indexed! We submit 2,716 images (because we create all our own images for products) and a disappointing zero are indexed. Under Health -> Index Status in WM Tools, we apparently have 4,169 pages indexed. I tend to assume these are old pages that now yield a 404 if they are visited.

It could be that Google's indexed count of 44 means "pages indexed by virtue of your sitemap, i.e. we didn't find them by crawling - so thanks for that", but despite trawling through Google's help, I don't really get that feeling. This is basic stuff, but I suspect a great number of us struggle to understand the disparity between our expectations and what WM Tools yields, and we go on to either ignore an important problem or waste time on non-issues. Can anyone shine a light on this once and for all? If you are interested, our map looks like this: http://www.1010direct.com/Sitemap.xml Many thanks Paul
Intermediate & Advanced SEO | fretts
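On the rel="canonical" belief above: one practical check is to confirm that every sitemap URL canonicalises to itself. A sketch of extracting the canonical target from a page's HTML with the standard library (the HTML snippet is a hypothetical stand-in):

```python
# Extract the rel="canonical" target from a page's HTML so each sitemap URL
# can be checked against it. The HTML snippet is a hypothetical stand-in.
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")

page = '<head><link rel="canonical" href="https://www.example.com/page/"></head>'
finder = CanonicalFinder()
finder.feed(page)
print(finder.canonical)  # https://www.example.com/page/
```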