Getting Authority Social Blogs to Index
-
We have a few authority blogs that I manage to help increase our brand awareness and build power for our website: Blogspot, Wordpress, Tumblr & Typepad. A summary of our content gets syndicated to these authority blogs with an attribution link back to the original post. I also manually check them once a month to make sure everything looks good and the content syndicated correctly. I even add unique content to these blogs once in a while. I recently realized that the majority of the pages are not indexing. I added the blogs to our GSC & Bing Webmaster Tools and submitted the sitemaps. This was done on December 11th; as of now, some pages have indexed in Google, and Bing says the sitemaps are still pending...
- Blogspot - 32/100 pages indexed
- Wordpress - 34/81 pages indexed
- Tumblr - 4/223 pages indexed
- Typepad - 3/63 pages indexed
Can anyone help me figure out why I can't get Google to index more pages, or Bing to process the sitemaps in a timely manner?
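One quick diagnostic before digging into content-quality issues is to confirm that the sitemaps are parseable and that the listed pages aren't accidentally carrying a noindex tag. A minimal sketch (the sitemap contents and URLs below are made up for illustration):

```python
import re
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Extract all <loc> URLs from a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

def has_noindex(html):
    """Return True if the page carries a robots noindex meta tag."""
    pattern = r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex'
    return re.search(pattern, html, re.IGNORECASE) is not None

# Demo with an inline sitemap; in practice, download each blog's sitemap.xml
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.blogspot.com/2017/12/post-1.html</loc></url>
  <url><loc>https://example.blogspot.com/2017/12/post-2.html</loc></url>
</urlset>"""

urls = sitemap_urls(sample)
print(len(urls), "URLs in sitemap")
```

In practice you would fetch each blog's live sitemap.xml and each listed URL, then run `has_noindex` against the returned HTML to rule out a technical blocker before assuming a quality problem.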
-
Thank you!!!! I've printed your responses and am applying your suggestions to my 2018 strategy. I've been using similar methods, but this really breaks it down and gives me what I need to build an actual organized game plan going forward.
-
What tool or strategy did you use to determine link prospects?
Buzzstream is a really good tool; for me it's a really good CRM for keeping my link prospects in order, but it is not even close to being a decent "link prospect generator". Please don't get me wrong, Buzzstream is a nice tool. I use it regularly to organize my link prospects, but I do not generate them with Buzzstream; I just use it to follow up on them.
In order, these are the better tools for that:
- Bing
Then you have
- Semrush
- Majestic
- Ahrefs
There is no magic tool, at least not one I know of. I use the APIs for all my tools
(Semrush, Majestic, Ahrefs and so on) to collect data, then organize it and repeat the process
over and over again. At the beginning it looks like a chaotic process, but once you do it over and over again you start to recognize the patterns. It is a repetitive, tedious and time-consuming process, which is why I created my own script.
And based on my experience, the best SEOs do the same (create their own framework). In fact, this is how Moz was born: it started as the SEOmoz agency.
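The kind of script described above can start as little more than a loop that pulls prospect URLs from each tool's API and dedupes them by domain. A rough sketch with stub fetchers standing in for the real Semrush/Majestic/Ahrefs clients (the URLs and function shapes are illustrative, not actual API signatures):

```python
from urllib.parse import urlparse

def collect_prospects(fetchers):
    """Merge link prospects from several tool APIs, deduped by domain.

    `fetchers` maps a tool name to a callable returning a list of URLs;
    in a real script each callable would wrap that tool's API client.
    """
    seen, prospects = set(), []
    for tool, fetch in fetchers.items():
        for url in fetch():
            domain = urlparse(url).netloc.lower()
            if domain.startswith("www."):
                domain = domain[4:]
            if domain and domain not in seen:
                seen.add(domain)
                prospects.append({"domain": domain, "source": tool, "url": url})
    return prospects

# Stub fetchers with made-up URLs, standing in for real API calls
fetchers = {
    "semrush": lambda: ["https://www.example.com/a", "https://blog.example.org/b"],
    "majestic": lambda: ["https://example.com/c", "https://other.net/d"],
}
rows = collect_prospects(fetchers)
print(len(rows), "unique prospect domains")
```

The dedupe-by-domain step is what turns raw tool exports into a usable prospect list, since the same site usually shows up in more than one tool.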
IF THIS ANSWER WAS USEFUL, MARK IT AS A GOOD ANSWER
-
Hi Lindsay, I'm glad I was useful and brought something positive.
In my case, I use Moz, Semrush, Majestic, Ahrefs and Raven Tools. All of them are really good tools.
How do you determine how many links you will need to rank on the first page?
Well, in that case you have two options: the manual way, which is hard and slow but very accurate, and the easy and fast way. In your case I used the second to do some quick research.
THE MANUAL WAY
With Moz
1. Select the keyword, in this case, we will use social security increase
2. Go to Moz Pro > Keyword Explorer > SERP Analysis
3. See full analysis and Export CSV
4. In that case, you will have the first 10 results for that specific keyword
5. Moz will give you these numbers:
- Monthly Volume: 1.7k-2.9k
- Difficulty: 62
- Organic CTR: 61%
- Priority: 61
6. Take every URL and run an audit with Open Site Explorer
In this case, the first result will be https://www.ssa.gov/news/cola/
- Domain Authority: 94
- Page Authority: 78
- 120 Root Domains
- 462 Total Links
Make a deep analysis:
Link Source
- External Links
- Internal Links
Link Type
- Follow
- No Follow
As an example
- Target > this page
- Link Source > Only External
- Link Type > Only Follow
Repeat the process over and over again until you get the job done
You can use Excel to collect the data, or you can download the CSV.
With Semrush
1. Select the keyword, in this case, we will use social security increase
2. Go to Semrush > Keyword Analytics > Organic Search Results > Export
3. Go to Semrush > Keyword Analytics > Keyword Difficulty Tool
- Difficulty: 90.72
- Volume: 590
4. Once you have downloaded all the URLs from Semrush (Top 10)
5. Analyze every one with Semrush
6. Go to Semrush > Domain Analytics and again collect the data in Excel
With those numbers, you will have the answer to your question.
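Instead of collecting everything by hand in Excel, the exports can be merged with a short script. A sketch that joins two hypothetical CSV exports (Moz SERP analysis and Semrush organic results) by URL; the column names and numbers are made up:

```python
import csv
import io

# Two hypothetical exports, inlined here; in practice you would
# open the CSV files downloaded from each tool.
moz_csv = """url,page_authority,root_domains
https://www.ssa.gov/news/cola/,78,120
https://example.com/cola,44,28
"""
semrush_csv = """url,traffic,keywords
https://www.ssa.gov/news/cola/,95000,1200
https://example.com/cola,3100,85
"""

def rows_by_url(text):
    """Index a CSV export by its url column."""
    return {r["url"]: r for r in csv.DictReader(io.StringIO(text))}

moz = rows_by_url(moz_csv)
semrush = rows_by_url(semrush_csv)

# One merged record per URL, combining both tools' metrics
merged = {url: {**row, **semrush.get(url, {})} for url, row in moz.items()}
print(merged["https://example.com/cola"])
```

Once the metrics from every tool sit in one record per URL, comparing the top 10 results side by side becomes trivial, which is the whole point of the manual process above.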
Keep in mind that all those top 10 pages are big websites, so you will not beat them in this world or any other world, or even in any other dimension.
But you can use Moz, Semrush and Long Tail Pro to find some interesting long-tail keywords that are easy to rank for,
and if you do your homework and rewrite the content to be as memorable as you can
(I'm not a copywriter, so I have someone on my team for that, but based on my experience a good article can cost you $20).
Find 10 or 20 keywords, focus on them, create outstanding content around those keywords, find link prospects,
and try to reach out to them. At the end of the day, you will have a sustainable SEO strategy (long-term SEO), not something where you pull a trick today and are gone from the search results tomorrow.
NOTE: I run these tasks as an automated process (using the APIs from Moz, Semrush, Majestic, etc.)
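The automated keyword side of this can be as simple as filtering an exported keyword list down to low-difficulty, multi-word terms. A sketch with illustrative thresholds and made-up rows (these are not tool recommendations):

```python
def longtail_candidates(keywords, max_difficulty=30, min_words=3, min_volume=100):
    """Filter a keyword export down to long-tail terms worth targeting.

    The thresholds are illustrative defaults; tune them to your niche.
    """
    return sorted(
        (k for k in keywords
         if k["difficulty"] <= max_difficulty
         and len(k["keyword"].split()) >= min_words
         and k["volume"] >= min_volume),
        key=lambda k: k["difficulty"],
    )

# Hypothetical rows from a Semrush / Long Tail Pro export
keywords = [
    {"keyword": "social security increase", "volume": 4300, "difficulty": 62},
    {"keyword": "social security cola increase 2018", "volume": 480, "difficulty": 22},
    {"keyword": "when is social security increase announced", "volume": 210, "difficulty": 18},
]
for k in longtail_candidates(keywords):
    print(k["keyword"], k["difficulty"])
```

The head term gets filtered out by its difficulty score, leaving only the long-tail variants you can realistically rank for, which matches the 10-to-20-keywords approach above.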
IF THIS ANSWER WAS USEFUL, MARK IT AS A GOOD ANSWER
-
Yes, very nice work Roman, thank you! I really appreciate your research and well thought out response.
Using your example...
I don't have Ahrefs, we use SEMRush. Pretty sure they have the same features overall. I also use Long Tail Pro, MOZ, Majestic, etc.
How did you determine this---> "You'll need backlinks from ~32 websites to rank in top 10 for this keyword"
Also, what tool or strategy did you use to determine link prospects? Were these the backlinks of those ranking for the keyword? We have Buzzstream; it's a great tool for link prospecting as well.
Regarding adding lists, infographics, statistics, etc... that's on my Q1 to-do list for sure. We just hired an in-house developer/designer who's going to help me with this.
Thank you again!
-
Nice work, Roman.
What a generous and informative reply.
-
EGOL is right, so I want to add some value for you from my point of view (a personal opinion based on my experience).
This is what I would do in your case
- Forget your blogs
- Analyze the articles on your main website's blog
- Get some keywords useful for those posts
- Build a link prospect list for your posts
- 20 good links pointing to a single article can give you more traffic than your entire network combined.
Let's take this article of your site as an example
https://www.medicarefaq.com/blog/social-security-benefit-increase-announced-2018/
Let's take social security increase as the main keyword and see some numbers from Ahrefs:
- Volume: 4,300
- Traffic potential: 31,000
- Keyword difficulty: 27/100
- You'll need backlinks from ~32 websites to rank in the top 10 for this keyword
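Ahrefs computes that "~32 websites" figure internally, but one plausible way to approximate it yourself is to take the median referring-domain count of the current top 10 results. A sketch with made-up counts:

```python
import statistics

# Hypothetical referring-domain counts for the current top-10 results,
# as you'd collect them from a backlink tool's export
referring_domains = [310, 120, 88, 45, 33, 31, 27, 22, 18, 15]

# Rough proxy: the median competitor's count suggests roughly how many
# linking websites a new page needs to be competitive on this SERP
estimate = statistics.median(referring_domains)
print(f"~{round(estimate)} linking websites to be competitive")
```

The median is deliberately used instead of the mean so that one outlier like a .gov page with hundreds of links doesn't inflate the target.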
So with this information in mind, I would create a list of link prospects.
I did 10 minutes of research and got 150 link prospects with these blog ranking criteria:
- Not a PBN
- Domain Authority
- Non-spun Content
- Unique IP's
- Free or Paid
- Type Of Link Given
- Indexed in Google
These are some examples of what I found:
findhealthtips.com > PA 31
diabeticlifestyle.com > PA 38
bhma.org > PA 39
Another good tip is to rewrite the article and turn it into the most memorable article related to social security increase, with lists, infographics and statistics, maybe some videos of relevant people talking about the topic, and internal and external links to related content.
I'm pretty sure that if you do that, it will give you more brand awareness, mentions, authority and traffic than all of your content network.
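Scoring prospects against criteria like these is easy to automate once each field is collected. A sketch with illustrative field names and two made-up prospect records (a real script would populate the fields from tool APIs and manual review):

```python
def qualifies(prospect, min_pa=30):
    """Apply blog-ranking criteria like the list above to one prospect."""
    return (
        not prospect["is_pbn"]
        and prospect["page_authority"] >= min_pa
        and not prospect["spun_content"]
        and prospect["indexed_in_google"]
    )

# Made-up prospect records for illustration
prospects = [
    {"domain": "findhealthtips.com", "page_authority": 31, "is_pbn": False,
     "spun_content": False, "indexed_in_google": True},
    {"domain": "spammy-network.example", "page_authority": 35, "is_pbn": True,
     "spun_content": True, "indexed_in_google": False},
]
shortlist = [p["domain"] for p in prospects if qualifies(p)]
print(shortlist)
```

Encoding the criteria as a single pass/fail function keeps the shortlist consistent across the 150 prospects instead of judging each one by feel.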
-
"Giving attribution links does not change the fact that they are duplicate and thin content. Why should Google index them? They do not provide anything new to the web. Google will either send these pages to the supplemental index or not index them at all. If they are indexed they will eventually fall out of the index."
I can see your point on this... and obviously that's the case with mine since only the content that's unique seems to be staying indexed.
"Yes, but if you can get unique and valuable links, you will be better off getting them pointing straight at your main domain."
Yes, I usually only focus on our main domain. Constantly looking for high authority links and guest blog opportunities.
-
Great answer!
-
"It's not considered duplicate or thin content because you're giving an attribution link back to the original content."
Giving attribution links does not change the fact that they are duplicate and thin content. Why should Google index them? They do not provide anything new to the web. Google will either send these pages to the supplemental index or not index them at all. If they are indexed they will eventually fall out of the index.
"Do you mean to get link value I need to have more authority backlinks to my blogspot or external links to other authority sites?"
Yes, but if you can get unique and valuable links, you will be better off getting them pointing straight at your main domain.
-
The strategy applied here is from SerpSpace syndication networks. From my experience, this strategy works well. It's not considered duplicate or thin content because you're giving an attribution link back to the original content. The Blogspot pages that have indexed have been shown to positively affect the SERP rankings of the pages they link to within a day of indexing.
"And, the blogspot site will not pass any link value to medicarefaq.com unless it has unique links into it from websites that are outside of your own network and not duplicates of websites that already link to medicarefaq.com."
Do you mean to get link value I need to have more authority backlinks to my blogspot or external links to other authority sites?
-
I visited a few of the post pages on the blogspot site. These pages appear to be simply the first few sentences from the medicarefaq.com website. I don't think that Google will like these pages because they are: A) signposts for medicarefaq.com, B) duplicate content of medicarefaq.com, and C) thin content.
For the blogspot site to be an asset, the content needs to be unique and substantive. And, the blogspot site will not pass any link value to medicarefaq.com unless it has unique links into it from websites that are outside of your own network and not duplicates of websites that already link to medicarefaq.com.
If these were my websites, I would put all of my time into medicarefaq.com and stop working on sites that merely link to it.