Getting Authority Social Blogs to Index
-
We have a few authority blogs that I manage to help increase our brand awareness and build power for our website: Blogspot, WordPress, Tumblr & Typepad. A summary of our content gets syndicated to our authority blogs with an attribution link back to the original post. I also manually check them once a month to make sure everything looks good and the content syndicated correctly. I even add unique content to these blogs once in a while. I recently realized that the majority of the pages are not indexing. I added the blogs to our Google Search Console & Bing Webmaster Tools and submitted the sitemaps. This was done on December 11th; as of now, some pages have indexed in Google, and Bing says the sitemaps are still pending...
- Blogspot - 32/100 pages indexed
- WordPress - 34/81 pages indexed
- Tumblr - 4/223 pages indexed
- Typepad - 3/63 pages indexed
Can anyone help me figure out why I can't get Google to index more pages, or why Bing won't process the sitemaps in a timely manner?
-
Thank you!!!! I've printed out your responses and am applying your suggestions to my 2018 strategy. I've been using similar methods, but this really breaks it down and gives me what I need to make an actual organized game plan going forward.
-
"What tool or strategy did you use to determine link prospects?"
Buzzstream is a really good tool; for me it's a really good CRM to keep my link prospects in order, but it is not even close to being a decent link-prospect generator. Please don't get me wrong, Buzzstream is a nice tool. I use it regularly to organize my link prospects, but I don't generate them with Buzzstream; I just use it to follow up on them.
In order, these are the better tools for that:
- Bing
Then you have
- Semrush
- Majestic
- Ahrefs
There is no magic tool, at least none that I know of. I use the API for all my tools (Semrush, Majestic, Ahrefs, and so on) to collect data, then organize it and repeat the process over and over again. At the beginning it looks like a chaotic process, but once you do it over and over you start to recognize the patterns. It is a repetitive, tedious, and time-consuming process, which is why I created my own script.
And based on my experience, the best SEOs do the same (they create their own framework). In fact, this is how Moz was born; it started as the SEOmoz agency.
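For anyone curious what that kind of script can look like, here is a minimal Python sketch of the collect-and-organize loop. The endpoint URL, parameter names, and response fields are placeholders, not the real Semrush/Majestic/Ahrefs APIs, so swap in the request format from your provider's API docs.

```python
import csv
import requests

# Hypothetical endpoint and fields -- substitute the real API details
# from your provider's documentation (Semrush, Majestic, Ahrefs, etc.).
API_URL = "https://api.example-seo-tool.com/v1/backlinks"
API_KEY = "YOUR_API_KEY"

def fetch_backlinks(target_url):
    """Pull backlink rows for one target URL from the (placeholder) API."""
    response = requests.get(
        API_URL,
        params={"target": target_url, "key": API_KEY, "limit": 100},
        timeout=30,
    )
    response.raise_for_status()
    return response.json().get("backlinks", [])

def collect(targets, out_path="prospects.csv"):
    """Repeat the same pull for every target and dump everything into one CSV."""
    with open(out_path, "w", newline="", encoding="utf-8") as fh:
        writer = csv.writer(fh)
        writer.writerow(["target", "source_url", "domain_authority", "follow"])
        for target in targets:
            for link in fetch_backlinks(target):
                writer.writerow([
                    target,
                    link.get("source_url"),
                    link.get("domain_authority"),
                    link.get("follow"),
                ])

if __name__ == "__main__":
    collect(["https://www.ssa.gov/news/cola/"])
```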
IF THIS ANSWER WAS USEFUL, MARK IT AS A GOOD ANSWER
-
Hi Lindsay, I'm glad I was useful and brought something positive.
In my case, I use Moz, Semrush, Majestic, Ahrefs, and Raven Tools. All of them are really good tools.
How do you determine how many links you will need to rank on the first page?
Well, in that case, you have two options: the manual way, which is hard and slow but very accurate, and the easy and fast way. In my case, I use the second to do quick research, like in your case.
THE MANUAL WAY
With Moz
1. Select the keyword; in this case, we will use "social security increase"
2. Go to Moz Pro > Keyword Explorer > SERP Analysis
3. See full analysis and Export CSV
4. In that export, you will have the first 10 results for that specific keyword
5. Moz will give you these numbers:
Monthly Volume 1.7k-2.9k
Difficulty 62
Organic CTR 61%
Priority 61
6. Take every URL and run an audit with Open Site Explorer
In this case, the first result will be https://www.ssa.gov/news/cola/
Domain Authority 94
Page Authority 78
It has 120 Root Domains
It has 462 Total Links
Make a deep analysis:
Link Source
- External Links
- Internal Links
Link Type
- Follow
- No Follow
As an example
- Target > this page
- Link Source > Only External
- Link Type > Only Follow
Repeat the process over and over again until you get the job done
You can use Excel to collect the data, or you can download the CSV.
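To make that last filtering step concrete, here is a small Python sketch that keeps only external, followed links from an exported links CSV. The column names ("Link Source", "Link Type") are assumptions about the export format; rename them to match the headers in your actual file.

```python
import csv

def filter_links(in_path, out_path):
    """Keep only external, followed links from an exported links CSV.

    The column names below ("Link Source", "Link Type") are assumed for
    illustration -- rename them to match your actual export headers.
    """
    with open(in_path, newline="", encoding="utf-8") as src, \
         open(out_path, "w", newline="", encoding="utf-8") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            is_external = row.get("Link Source", "").strip().lower() == "external"
            is_follow = row.get("Link Type", "").strip().lower() == "follow"
            if is_external and is_follow:
                writer.writerow(row)

# File names are placeholders for whatever you exported in step 6.
filter_links("ssa-gov-links-export.csv", "external-follow-links.csv")
```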
With Semrush
1. Select the keyword; in this case, we will use "social security increase"
2. Go to Semrush > Keyword Analytics > Organic Search Results > Export
3. Go to Semrush > Keyword Analytics > Keyword Difficulty Tool
Difficulty 90.72
Volume 590
4. Once you have downloaded all the URLs on Semrush (Top 10)
5. Analyze every one with Semrush
6. Semrush > Domain Analytics, and again collect the data in Excel
With those numbers, you will have the answer to your question.
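As a rough illustration of how those collected numbers answer the question, here is a tiny sketch that takes the referring-domain counts gathered for the current top 10 and uses their median as a ballpark target. The numbers in the list are made-up placeholders, and a median is only a crude heuristic, not how Ahrefs or Semrush calculate their own estimates.

```python
from statistics import median

# Referring-domain counts gathered for each of the current top-10 URLs.
# These values are illustrative placeholders, not real data.
referring_domains_top10 = [120, 95, 64, 58, 47, 41, 38, 35, 33, 30]

# A crude ballpark: aim for roughly the median of what already ranks.
target = median(referring_domains_top10)
print(f"Rough target: backlinks from ~{target:.0f} referring domains")
```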
Keep in mind that all of those top-10 pages are big websites, so you will not beat them in this world or any other world, or even in any other dimension.
But you can use Moz, Semrush, and Long Tail Pro to find some interesting long-tail keywords that are easy to rank for,
and if you do your homework and rewrite the content to be as memorable as you can
(I'm not a copywriter, so I have someone on my team for that, but based on my experience a good article can cost you $20).
Find 10 or 20 keywords, focus on them, create outstanding content around those keywords, find link prospects,
and try to reach out to them. At the end of the day, you will have a sustainable SEO strategy (long-term SEO), not something where you pull a trick today and are gone from the search results tomorrow.
NOTE: I run these tasks as an automated process (using the APIs from Moz, Semrush, Majestic, etc.).
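If you are shortlisting those 10 or 20 keywords from an exported keyword list, a filter like the sketch below is one way to do it. The column names and the difficulty/volume thresholds are assumptions for illustration, so adjust them to your own export and niche.

```python
import csv

def shortlist_keywords(csv_path, max_difficulty=30, min_volume=100):
    """Return keywords that look easy enough to rank for but still have demand.

    Assumes the export has "Keyword", "Difficulty", and "Volume" columns;
    rename these to match your actual Moz/Semrush/Long Tail Pro export.
    """
    picks = []
    with open(csv_path, newline="", encoding="utf-8") as fh:
        for row in csv.DictReader(fh):
            try:
                difficulty = float(row["Difficulty"])
                volume = float(row["Volume"].replace(",", ""))
            except (KeyError, ValueError):
                continue  # skip rows with missing or malformed numbers
            if difficulty <= max_difficulty and volume >= min_volume:
                picks.append((row["Keyword"], difficulty, volume))
    # Easiest keywords first, then by volume descending.
    return sorted(picks, key=lambda kw: (kw[1], -kw[2]))

for keyword, difficulty, volume in shortlist_keywords("keyword-export.csv")[:20]:
    print(f"{keyword}: difficulty {difficulty:.0f}, volume {volume:.0f}")
```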
IF THIS ANSWER WAS USEFUL, MARK IT AS A GOOD ANSWER
-
Yes, very nice work Roman, thank you! I really appreciate your research and well thought out response.
Using your example...
I don't have Ahrefs, we use SEMRush. Pretty sure they have the same features overall. I also use Long Tail Pro, MOZ, Majestic, etc.
How did you determine this---> "You'll need backlinks from ~32 websites to rank in top 10 for this keyword"
Also, what tool or strategy did you use to determine link prospects? Were these the backlinks of those ranking for the keyword? We have Buzzstream; it's a great tool for link prospecting as well.
Regarding adding lists, infographics, statistics, etc... that's on my Q1 to-do list for sure. We just hired an in-house developer/designer who's going to help me with this.
Thank you again!
-
Nice work, Roman.
What a generous and informative reply.
-
EGOL is right, so I want to add some value for you from my point of view (a personal opinion based on my experience).
This is what I would do in your case
- Forget your blogs
- Analyze the articles on your main website's blog
- Get some keywords useful for those posts
- Build a link-prospect list for your posts
- 20 good links pointing to a single article can give you more traffic than your whole network put together.
Let's take this article from your site as an example:
https://www.medicarefaq.com/blog/social-security-benefit-increase-announced-2018/
Let's take social security increase as the main keyword and see some numbers from Ahrefs:
- Volume 4,300
- Traffic potential 31,000
- Keyword difficulty 27/100
- You'll need backlinks from ~32 websites to rank in top 10 for this keyword
So with this information in mind, I will create a list of link prospects.
I did 10 minutes of research and got 150 link prospects with these blog-ranking criteria:
- Not a PBN
- Domain Authority
- Non-spun Content
- Unique IPs
- Free or Paid
- Type Of Link Given
- Indexed in Google
These are some examples of what I found:
findhealthtips.com > PA 31
diabeticlifestyle.com > PA 38
bhma.org > PA 39
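As a sketch of how criteria like these could be applied to a raw prospect list programmatically, here is one way it might look. The Prospect fields and the PA threshold are assumptions for illustration; the actual checks (PBN detection, spun content, index status) still come from your own tools and judgment.

```python
from dataclasses import dataclass

@dataclass
class Prospect:
    domain: str
    page_authority: int
    is_pbn: bool             # flagged by your own PBN checks
    has_spun_content: bool   # flagged by a manual or tool-based content review
    indexed_in_google: bool  # e.g. verified with a site: search

def passes_criteria(p: Prospect, min_pa: int = 30) -> bool:
    """Apply the blog-ranking criteria from the list above (threshold assumed)."""
    return (
        not p.is_pbn
        and not p.has_spun_content
        and p.indexed_in_google
        and p.page_authority >= min_pa
    )

# Illustrative entries only -- the flags and second domain are placeholders.
candidates = [
    Prospect("findhealthtips.com", 31, False, False, True),
    Prospect("some-spammy-network.com", 45, True, False, True),
]
shortlist = [p.domain for p in candidates if passes_criteria(p)]
print(shortlist)  # ['findhealthtips.com']
```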
Another good tip is to rewrite the article and turn it into the most memorable article related to social security increase, with lists, infographics, and statistics, maybe some videos of relevant people talking about the topic, and internal and external links to related content.
I'm pretty sure that if you do that, it will give you more brand awareness, mentions, authority, and traffic than all of your content network.
-
"Giving attribution links does not change the fact that they are duplicate and thin content. Why should Google index them? They do not provide anything new to the web. Google will either send these pages to the supplemental index or not index them at all. If they are indexed they will eventually fall out of the index."
I can see your point on this... and obviously that's the case with mine since only the content that's unique seems to be staying indexed.
"Yes, but if you can get unique and valuable links, you will be better off getting them pointing straight at your main domain."
Yes, I usually only focus on our main domain. Constantly looking for high authority links and guest blog opportunities.
-
Great answer!
-
"It's not considered duplicate or thin content because you're giving an attribution link back to the original content."
Giving attribution links does not change the fact that they are duplicate and thin content. Why should Google index them? They do not provide anything new to the web. Google will either send these pages to the supplemental index or not index them at all. If they are indexed they will eventually fall out of the index.
"Do you mean to get link value I need to have more authority backlinks to my blogspot or external links to other authority sites?"
Yes, but if you can get unique and valuable links, you will be better off getting them pointing straight at your main domain.
-
The strategy applied here is from SerpSpace syndication networks. From my experience, this strategy works well. It's not considered duplicate or thin content because you're giving an attribution link back to the original content. The Blogspot pages that have indexed have shown a positive impact on the SERPs of the pages they link to within a day of indexing.
"And, the blogspot site will not pass any linkvalue to medicarefaq.com unless it has unique links into it from websites that are outside of your own network and not duplicates of websites that already link to medicarefaq.com."
Do you mean to get link value I need to have more authority backlinks to my blogspot or external links to other authority sites?
-
I visited a few of the post pages on the blogspot site. These pages appear to be simply the first few sentences from the medicarefaq.com website.  I don't think that Google will like these pages because they are: A) signposts for medicarefaq.com, B) duplicate content of medicarefaq.com, and C) thin content.
For the blogspot site to be an asset, the content needs to be unique and substantive. And, the blogspot site will not pass any linkvalue to medicarefaq.com unless it has unique links into it from websites that are outside of your own network and not duplicates of websites that already link to medicarefaq.com.
If these were my websites, I would put all of my time into medicarefaq.com and stop working on sites that merely link to it.