Getting Authority Social Blogs to Index
-
We have a few authority blogs that I manage to help increase our brand awareness and build power for our website: Blogspot, WordPress, Tumblr & Typepad. Each piece of our content gets a summary syndicated to these authority blogs with an attribution link back to the original post. I also manually check them once a month to make sure everything looks good and the content syndicated correctly. I even add unique content to these blogs once in a while. I recently realized that the majority of the pages are not indexing. I added the blogs to our GSC & Bing Webmaster Tools and submitted the sitemaps. This was done on December 11th; as of now, some pages have indexed in Google, and Bing says the sitemaps are still pending...
- Blogspot - 32/100 pages indexed
- Wordpress - 34/81 pages indexed
- Tumblr - 4/223 pages indexed
- Typepad - 3/63 pages indexed
Can anyone help me figure out why I can't get Google to index more pages or Bing to process the sitemaps in a timely manner?
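For what it's worth, a quick way to sanity-check that a sitemap is reachable and well-formed is a few lines of Python like the sketch below; the sitemap URLs here are placeholders, not the actual blog URLs.

```python
import requests
import xml.etree.ElementTree as ET

# Placeholder sitemap URLs -- substitute the sitemaps actually submitted to GSC/Bing.
SITEMAPS = [
    "https://example-blog.blogspot.com/sitemap.xml",
    "https://example-blog.wordpress.com/sitemap.xml",
]

for url in SITEMAPS:
    resp = requests.get(url, timeout=30)
    print(url, "->", resp.status_code)
    if resp.ok:
        # A well-formed sitemap parses as XML; count the <loc> entries it lists.
        root = ET.fromstring(resp.content)
        locs = [el.text for el in root.iter() if el.tag.endswith("loc")]
        print("  URLs listed:", len(locs))
```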
-
Thank you!!!! I've printed your responses and am applying your suggestions to my 2018 strategy. I've been using similar methods, but this really breaks it down and gives me what I need to make an actual, organized game plan going forward.
-
"What tool or strategy did you use to determine link prospects?"
Buzzstream is a really good tool. For me, it's a really good CRM for keeping my link prospects in order, but it's not even close to being a decent link prospect generator. Please don't get me wrong, Buzzstream is a nice tool; I use it regularly to organize my link prospects, but I don't generate them with Buzzstream, I just use it to follow up with them. In order, these are the better tools for that:
- Bing
Then you have
- Semrush
- Majestic
- Ahrefs
There is no magic tool, at least not one I know of. I use the APIs for all my tools (Semrush, Majestic, Ahrefs, and so on) to collect data, then organize it and repeat the process over and over again. At the beginning it looks like a chaotic process, but once you do it over and over again you start to recognize the patterns. It is a repetitive, tedious, and time-consuming process, which is why I created my own script.
Based on my experience, the best SEOs do the same (they create their own framework). In fact, this is how Moz was born; it started as the SEOmoz agency.
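To give a rough idea of what such a script can look like, here is a minimal sketch; the endpoint, parameters, and response fields are hypothetical placeholders, since each provider (Moz, Semrush, Ahrefs, Majestic) has its own real API with different URLs, authentication, and field names.

```python
import csv
import requests

# Hypothetical endpoint and key -- each provider (Moz, Semrush, Ahrefs, Majestic)
# has its own real API with different URLs, auth, and response fields.
API_URL = "https://api.example-seo-tool.com/v1/backlinks"
API_KEY = "YOUR_API_KEY"

# Pages you want backlink data for.
targets = [
    "https://www.ssa.gov/news/cola/",
    "https://www.example.com/some-post/",
]

with open("link_data.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["target", "source_url", "domain_authority", "follow"])
    for target in targets:
        resp = requests.get(API_URL, params={"target": target, "key": API_KEY}, timeout=30)
        resp.raise_for_status()
        # Assumed response shape: {"links": [{"source_url": ..., "domain_authority": ..., "follow": ...}]}
        for link in resp.json().get("links", []):
            writer.writerow([target, link.get("source_url"),
                             link.get("domain_authority"), link.get("follow")])
```

Run something like this on a schedule and you end up with one growing CSV you can sort and dedupe, instead of re-exporting from each tool by hand.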
IF THIS ANSWER WAS USEFUL, MARK IT AS A GOOD ANSWER
-
Hi Lindsay, I'm glad I was useful and brought something positive.
In my case, I use Moz, Semrush, Majestic, Ahrefs, and Raven Tools. All of them are really good tools.
How do you determine how many links you will need to rank on the first page?
Well, in that case, you have two options: the manual way, which is hard and slow but very accurate, and the easy and fast way. In my case, I use the second to do quick research like in your case.
THE MANUAL WAY
With Moz
1. Select the keyword, in this case, we will use social security increase
2. Go to Moz Pro > Keyword Explorer > SERP Analysis
3. See full analysis and Export CSV
4. In that case, you will have the first 10 results for that specific keyword
5. Moz will give you these numbers:
- Monthly Volume: 1.7k-2.9k
- Difficulty: 62
- Organic CTR: 61%
- Priority: 61
6. Take every URL and run an audit with Open Site Explorer
In this case, the first result will be https://www.ssa.gov/news/cola/
- Domain Authority: 94
- Page Authority: 78
- Root Domains: 120
- Total Links: 462
Make a deep analysis:
Link Source
- External Links
- Internal Links
Link Type
- Follow
- No Follow
As an example
- Target > this page
- Link Source > Only External
- Link Type > Only Follow
Repeat the process over and over again until you get the job done
You can use Excel to collect the data, or you can download the CSV.
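If you go the CSV route instead of Excel, a few lines of Python can apply the same filter described above (external links only, follow only) and count linking root domains. The column names here are assumptions and will depend on the tool's actual export headers.

```python
import csv
from urllib.parse import urlparse

# Column names are assumptions -- adjust them to the actual CSV export headers.
root_domains = set()
with open("links_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        if row.get("Link Type", "").strip().lower() != "follow":
            continue  # keep only followed links
        if row.get("Link Source", "").strip().lower() != "external":
            continue  # keep only external links
        root_domains.add(urlparse(row["Source URL"]).netloc)

print("Followed external linking root domains:", len(root_domains))
```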
With Semrush
1. Select the keyword, in this case, we will use social security increase
2. Go to Semrush > Keyword Analytics > Organic Search Results > Export
3. Go to Semrush > Keyword Analytics > Keyword Difficulty Tool
- Difficulty: 90.72
- Volume: 590
4. Once you have downloaded all the URLs on Semrush (Top 10)
5. Analyze every one with Semrush
6. Go to Semrush > Domain Analytics and again collect the data in Excel
With those numbers, you will have the answer to your question.
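As a sketch of how to turn those numbers into an answer: once you have the linking root domain counts for each of the top 10 results, one rough rule of thumb is to aim for at least the median. The counts below are made-up example values, and this is not any tool's exact formula.

```python
import statistics

# Linking root domain counts for the current top 10 results (made-up example values).
root_domains_top10 = [120, 95, 80, 60, 55, 48, 40, 35, 30, 28]

# Rough rule of thumb: aim for at least the median of the pages already ranking.
print("Estimated referring domains needed:", statistics.median(root_domains_top10))
```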
Keep in mind all those top 10 pages are big websites, so you will not beat them in this world, any other world, or even any other dimension.
But you can use Moz, Semrush, and Long Tail Pro to find some interesting long-tail keywords that are easy to rank for, and do your homework and rewrite the content to be as memorable as you can (I'm not a copywriter, so I have someone on my team for that, but based on my experience a good article can cost you $20).
Find 10 or 20 keywords, focus on them, create outstanding content around those keywords, find link prospects, and try to outreach to them. At the end of the day, you will have a sustainable SEO strategy (long-term SEO), not something where you pull a trick today and are gone from the search results tomorrow.
NOTE: I run these tasks as an automated process (using the APIs from Moz, Semrush, Majestic, etc.)
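As an illustration of the "find 10 or 20 easier keywords" step above, a small script can filter an exported keyword CSV by volume and difficulty thresholds. The column names and thresholds below are assumptions, so adjust them to whatever your tool exports.

```python
import csv

# Thresholds and column names are assumptions; tune them to your keyword export.
MAX_DIFFICULTY = 30
MIN_VOLUME = 200

with open("keyword_export.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

easy_long_tails = [
    r for r in rows
    if float(r["Difficulty"]) <= MAX_DIFFICULTY
    and int(r["Volume"].replace(",", "")) >= MIN_VOLUME
]

# Print the 20 easiest candidates to focus on.
for r in sorted(easy_long_tails, key=lambda r: float(r["Difficulty"]))[:20]:
    print(r["Keyword"], r["Volume"], r["Difficulty"])
```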
IF THIS ANSWER WAS USEFUL, MARK IT AS A GOOD ANSWER
-
Yes, very nice work Roman, thank you! I really appreciate your research and well thought out response.
Using your example...
I don't have Ahrefs, we use SEMRush. Pretty sure they have the same features overall. I also use Long Tail Pro, MOZ, Majestic, etc.
How did you determine this---> "You'll need backlinks from ~32 websites to rank in top 10 for this keyword"
Also, what tool or strategy did you use to determine link prospects? Were these the backlinks of those ranking for the keyword? We have Buzzstream; it's a great tool for link prospecting as well.
Regarding adding lists, infographics, statistics, etc... that's on my Q1 to-do list for sure. We just hired an in-house developer/designer who's going to help me with this.
Thank you again!
-
Nice work, Roman.
What a generous and informative reply.
-
EGOL is right, so I want to add some value from my point of view (a personal opinion based on my experience).
This is what I would do in your case
- Forget your blogs
- Analyze the articles on your main website blog
- Get some useful keywords for those posts
- Build a link prospect list for your posts
- 20 good links pointing to a single article can give you more traffic than your whole network together.
Let's take this article from your site as an example:
https://www.medicarefaq.com/blog/social-security-benefit-increase-announced-2018/
Let's take social security increase as the main keyword and look at some numbers from Ahrefs:
- Volume: 4,300
- Traffic potential: 31,000
- Keyword difficulty: 27/100
- You'll need backlinks from ~32 websites to rank in the top 10 for this keyword
So with this information in mind, I would create a list of link prospects.
I did 10 minutes of research and got 150 link prospects with these blog ranking criteria:
- NOT A PBN
- Domain Authority
- Non-spun Content
- Unique IPs
- Free or Paid
- Type Of Link Given
- Indexed in Google
These are some examples of what I found:
- findhealthtips.com > PA 31
- diabeticlifestyle.com > PA 38
- bhma.org > PA 39
Another good tip is to rewrite the article and turn it into the most memorable article related to social security increase, with lists, infographics, and statistics, maybe some videos of relevant people talking about the topic, and internal and external links to related content.
I'm pretty sure that doing this will give you more brand awareness, mentions, authority, and traffic than your entire content network.
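For what it's worth, the blog ranking criteria above also lend themselves to a simple scripted filter once the prospect data is in a CSV. This is only a sketch with assumed field names, since the real checks (PBN detection, spun content, index status) still need data from your tools.

```python
import csv

MIN_PAGE_AUTHORITY = 30

# Field names are assumptions about how the prospect list was exported.
with open("prospects.csv", newline="", encoding="utf-8") as f:
    prospects = list(csv.DictReader(f))

qualified = [
    p for p in prospects
    if float(p["Page Authority"]) >= MIN_PAGE_AUTHORITY
    and p["PBN"].strip().lower() == "no"
    and p["Indexed in Google"].strip().lower() == "yes"
]

# Keep one prospect per IP so every link comes from a unique IP.
seen_ips, unique_prospects = set(), []
for p in qualified:
    if p["IP"] not in seen_ips:
        seen_ips.add(p["IP"])
        unique_prospects.append(p)

print(len(unique_prospects), "prospects passed the criteria")
```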
-
"Giving attribution links does not change the fact that they are duplicate and thin content. Why should Google index them? They do not provide anything new to the web. Google will either send these pages to the supplemental index or not index them at all. If they are indexed they will eventually fall out of the index."
I can see your point on this... and obviously that's the case with mine since only the content that's unique seems to be staying indexed.
"Yes, but if you can get unique and valuable links, you will be better off getting them pointing straight at your main domain."
Yes, I usually only focus on our main domain. Constantly looking for high authority links and guest blog opportunities.
-
Great answer!
-
"It's not considered duplicate or thin content because you're giving an attribution link back to the original content."
Giving attribution links does not change the fact that they are duplicate and thin content. Why should Google index them? They do not provide anything new to the web. Google will either send these pages to the supplemental index or not index them at all. If they are indexed they will eventually fall out of the index.
"Do you mean to get link value I need to have more authority backlinks to my blogspot or external links to other authority sites?"
Yes, but if you can get unique and valuable links, you will be better off getting them pointing straight at your main domain.
-
The strategy applied here is from SerpSpace syndication networks. From my experience, this strategy works well. It's not considered duplicate or thin content because you're giving an attribution link back to the original content. The Blogspot pages that have indexed have been shown to positively affect the SERP rankings of the pages they link to within a day of indexing.
"And, the blogspot site will not pass any linkvalue to medicarefaq.com unless it has unique links into it from websites that are outside of your own network and not duplicates of websites that already link to medicarefaq.com."
Do you mean to get link value I need to have more authority backlinks to my blogspot or external links to other authority sites?
-
I visited a few of the post pages on the blogspot site. These pages appear to be simply the first few sentences from the medicarefaq.com website.  I don't think that Google will like these pages because they are: A) signposts for medicarefaq.com, B) duplicate content of medicarefaq.com, and C) thin content.
For the blogspot site to be an asset, the content needs to be unique and substantive. And, the blogspot site will not pass any linkvalue to medicarefaq.com unless it has unique links into it from websites that are outside of your own network and not duplicates of websites that already link to medicarefaq.com.
If these were my websites, I would put all of my time into medicarefaq.com and stop working on sites that merely link to it.