Getting Authority Social Blogs to Index
-
We have a few authority blogs that I manage to help increase our brand awareness and build power for our website: Blogspot, WordPress, Tumblr & Typepad. Our content gets a summary syndicated to these authority blogs with an attribution link back to the original post. I also manually check them once a month to make sure everything looks good and the content syndicated correctly. I even add unique content to these blogs once in a while. I recently realized that the majority of the pages are not indexing. I added the blogs to GSC & Bing Webmaster Tools and submitted the sitemaps. This was done on December 11th; as of now, some pages have indexed in Google, and Bing says the sitemaps are still pending...
- Blogspot - 32/100 pages indexed
- WordPress - 34/81 pages indexed
- Tumblr - 4/223 pages indexed
- Typepad - 3/63 pages indexed
Can anyone help me figure out why I can't get Google to index more pages, or get Bing to process the sitemaps in a timely manner?
-
Thank you!!!! I've printed out your responses and am applying your suggestions to my 2018 strategy. I've been using similar methods, but this really breaks it down and gives me what I need to make an organized game plan going forward.
-
What tool or strategy did you use to determine link prospects?
BuzzStream is a really good tool; for me it's a solid CRM for keeping my link prospects in order, but it's not even close to being a decent link-prospect generator. Please don't get me wrong, BuzzStream is a nice tool, and I use it regularly to organize my link prospects and follow up on them, but I don't generate prospects with it.
In order, these are the better tools for that:
- Bing
Then you have
- Semrush
- Majestic
- Ahrefs
There is no magic tool, at least not one that I know of. I use the APIs for all my tools (Semrush, Majestic, Ahrefs, and so on) to collect data, then organize it and repeat the process over and over again. At the beginning it looks like a chaotic process, but once you've done it over and over, you start to recognize the patterns. It is a repetitive, tedious, and time-consuming process, which is why I created my own script.
Based on my experience, the best SEOs do the same (create their own framework). In fact, this is how Moz was born: it started as the SEOmoz agency.
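As a rough illustration of that collect-and-organize loop: in practice the rows would come from each tool's API (endpoints, authentication, and column names vary by tool, so everything below is placeholder data), but the merge-and-dedupe step might look like this:

```python
import csv
import io
from urllib.parse import urlparse

# Simulated exports from two tools. In practice these rows would come from
# the Semrush/Majestic/Ahrefs APIs; the column names here are placeholders.
ahrefs_csv = """source_url,target_url,link_type
https://example-blog.com/post,https://mysite.com/article,follow
https://news-site.org/story,https://mysite.com/article,nofollow
"""
majestic_csv = """source_url,target_url,link_type
https://example-blog.com/other,https://mysite.com/article,follow
https://directory.net/listing,https://mysite.com/article,follow
"""

def merge_backlinks(*csv_texts):
    """Combine raw exports, keep only followed links, dedupe by source domain."""
    seen, merged = set(), []
    for text in csv_texts:
        for row in csv.DictReader(io.StringIO(text)):
            if row["link_type"] != "follow":
                continue
            domain = urlparse(row["source_url"]).netloc
            if domain in seen:
                continue
            seen.add(domain)
            merged.append(row)
    return merged

links = merge_backlinks(ahrefs_csv, majestic_csv)
print([urlparse(r["source_url"]).netloc for r in links])
# -> ['example-blog.com', 'directory.net']
```

The pattern recognition comes from running this over many keywords and watching which domains keep reappearing.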
IF THIS ANSWER WERE USEFUL MARK IT AS A GOOD ANSWER
-
Hi Lindsay, I'm glad I could be useful and bring something positive.
In my case, I use Moz, Semrush, Majestic, Ahrefs, and Raven Tools. All of them are really good tools.
How do you determine how many links you will need to rank on the first page?
Well, in that case you have two options: the manual way, which is hard and slow but very accurate, and the easy, fast way. For quick research like in your case, I use the second.
THE MANUAL WAY
With Moz
1. Select the keyword, in this case, we will use social security increase
2. Go to Moz Pro > Keyword Explorer > SERP Analysis
3. See full analysis and Export CSV
4. In that case, you will have the first 10 results for that specific keyword
5. Moz will give you these numbers:
- Monthly Volume: 1.7k-2.9k
- Difficulty: 62
- Organic CTR: 61%
- Priority: 61
6. Take every URL and run an audit with Open Site Explorer
In this case, the first result will be https://www.ssa.gov/news/cola/
- Domain Authority: 94
- Page Authority: 78
- 120 Root Domains
- 462 Total Links
Make a deep analysis:
Link Source
- External Links
- Internal Links
Link Type
- Follow
- No Follow
As an example
- Target > this page
- Link Source > Only External
- Link Type > Only Follow
Repeat the process over and over again until you get the job done
you can use Excel to collect the data, or you can download the CSV.
With Semrush
1. Select the keyword, in this case, we will use social security increase
2. Go to Semrush > Keyword Analytics > Organic Search Results > Export
3. Go to Semrush > Keyword Analytics > Keyword Difficulty Tool
- Difficulty: 90.72
- Volume: 590
4. Once you have downloaded all the URLs from Semrush (Top 10)
5. Analyze every one with Semrush
6. Semrush > Domain Analytics, and again collect the data in Excel
With those numbers, you will have the answer to your question.
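Once the top-10 metrics are exported, the copy-into-Excel step can also be scripted. A minimal sketch (the figures below are sample values of the kind Open Site Explorer and Semrush report, except for the ssa.gov row, which uses the numbers quoted above):

```python
# Assemble the per-URL audit numbers from steps 1-6 into one summary,
# replacing the manual copy-into-Excel step.
top_results = [
    {"url": "https://www.ssa.gov/news/cola/", "da": 94, "pa": 78, "root_domains": 120},
    {"url": "https://example-news.com/ss-increase", "da": 71, "pa": 55, "root_domains": 48},
    {"url": "https://example-finance.org/cola-2018", "da": 64, "pa": 49, "root_domains": 33},
]

def summarize(results):
    """Sort the SERP by Page Authority and report average DA/PA."""
    ranked = sorted(results, key=lambda r: r["pa"], reverse=True)
    avg = lambda key: round(sum(r[key] for r in results) / len(results), 1)
    return ranked, {"avg_da": avg("da"), "avg_pa": avg("pa")}

ranked, averages = summarize(top_results)
print(averages)  # {'avg_da': 76.3, 'avg_pa': 60.7}
```

Comparing your own page's DA/PA against these averages is what actually answers the "how many links do I need" question.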
Keep in mind that all of those top-10 pages are big websites, so you will not beat them in this world, any other world, or even any other dimension.
But you can use Moz, Semrush, and Long Tail Pro to find some interesting long-tail keywords that are easy to rank for,
and if you do your homework and rewrite the content to be as memorable as you can
(I'm not a copywriter, so I have someone on my team for that, but based on my experience a good article can cost you $20).
Find 10 or 20 keywords, focus on them, create outstanding content around those keywords, find link prospects,
and try to do outreach to them. At the end of the day, you will have a sustainable SEO strategy (long-term SEO), not something where you pull a trick today and are gone from the search results tomorrow.
NOTE: I run these tasks as an automated process (using the APIs from Moz, Semrush, Majestic, etc.)
IF THIS ANSWER WERE USEFUL MARK IT AS A GOOD ANSWER
-
Yes, very nice work Roman, thank you! I really appreciate your research and well thought out response.
Using your example...
I don't have Ahrefs, we use SEMRush. Pretty sure they have the same features overall. I also use Long Tail Pro, MOZ, Majestic, etc.
How did you determine this---> "You'll need backlinks from ~32 websites to rank in top 10 for this keyword"
Also, what tool or strategy did you use to determine link prospects? Were these the backlinks of those ranking for the keyword? We have BuzzStream; it's a great tool for link prospecting as well.
Regarding adding lists, infographics, statistics, etc., that's on my Q1 to-do list for sure. We just hired an in-house developer/designer who's going to help me with this.
Thank you again!
-
Nice work, Roman.
What a generous and informative reply.
-
EGOL is right, so I want to add some value for you from my point of view (a personal opinion based on my experience).
This is what I would do in your case
- Forget your blogs
- Analyze the articles on your main website's blog
- Get some keywords that are useful for those posts
- Build a link-prospect list for your posts
- 20 good links pointing to a single article can give you more traffic than your entire network combined.
Let's take this article of your site as an example
https://www.medicarefaq.com/blog/social-security-benefit-increase-announced-2018/
Let's take social security increase as the main keyword, and let's see some numbers from Ahrefs:
- Volume: 4,300
- Traffic potential: 31,000
- Keyword difficulty: 27/100
- You'll need backlinks from ~32 websites to rank in the top 10 for this keyword
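That "~32 websites" figure is Ahrefs' estimate from its keyword difficulty model, which is proprietary. A rough home-grown approximation of the same idea is the median referring-domain count of the current top 10 (the counts below are made-up sample values, not real Ahrefs data):

```python
import statistics

# Referring-domain counts for each of the current top-10 results
# (sample values for illustration; pull real counts from your backlink tool).
referring_domains_top10 = [120, 85, 60, 44, 38, 32, 29, 25, 21, 18]

# Use the median rather than the mean so one huge outlier (e.g. ssa.gov)
# doesn't inflate the target.
needed = statistics.median(referring_domains_top10)
print(needed)  # 35.0
```

The median is a deliberately conservative target: roughly half of the pages already ranking got there with fewer referring domains than that.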
So with this information in mind, I would create a list of link prospects.
I did 10 minutes of research and got 150 link prospects with this blog-ranking criteria:
- NOT A PBN
- Domain Authority
- Non-spun Content
- Unique IP's
- Free or Paid
- Type Of Link Given
- Indexed in Google
These are some examples of what I found:
findhealthtips.com > PA 31
diabeticlifestyle.com > PA 38
bhma.org > PA 39
Another good tip is to rewrite the article and turn it into the most memorable article related to social security increase, with lists, infographics, and statistics, maybe some videos of relevant people talking about the topic, and internal and external links to related content.
I'm pretty sure that if you do that, it will give you more brand awareness, mentions, authority, and traffic than your entire content network.
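The blog-ranking criteria above can be expressed as a simple filter script. Everything here is illustrative: the field names and the PA threshold are assumptions, and in practice the values would come from your own crawl plus the Moz/Majestic APIs:

```python
# Candidate prospects with the attributes from the criteria list.
# Sample data; real values would come from your crawl and link-metrics APIs.
prospects = [
    {"domain": "findhealthtips.com", "pa": 31, "is_pbn": False, "spun_content": False, "indexed": True},
    {"domain": "spammy-network.biz", "pa": 12, "is_pbn": True, "spun_content": True, "indexed": False},
    {"domain": "bhma.org", "pa": 39, "is_pbn": False, "spun_content": False, "indexed": True},
]

def qualifies(p, min_pa=25):
    """Apply the blog-ranking criteria: no PBNs, no spun content,
    indexed in Google, and a minimum Page Authority (threshold assumed)."""
    return (not p["is_pbn"]
            and not p["spun_content"]
            and p["indexed"]
            and p["pa"] >= min_pa)

shortlist = [p["domain"] for p in prospects if qualifies(p)]
print(shortlist)  # ['findhealthtips.com', 'bhma.org']
```

Running a filter like this over a few hundred candidates is how a 10-minute research pass turns into a clean 150-prospect list.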
-
"Giving attribution links does not change the fact that they are duplicate and thin content. Why should Google index them? They do not provide anything new to the web. Google will either send these pages to the supplemental index or not index them at all. If they are indexed they will eventually fall out of the index."
I can see your point on this... and obviously that's the case with mine since only the content that's unique seems to be staying indexed.
"Yes, but if you can get unique and valuable links, you will be better off getting them pointing straight at your main domain."
Yes, I usually only focus on our main domain. Constantly looking for high authority links and guest blog opportunities.
-
Great answer!
-
"It's not considered duplicate or thin content because you're giving an attribution link back to the original content."
Giving attribution links does not change the fact that they are duplicate and thin content. Why should Google index them? They do not provide anything new to the web. Google will either send these pages to the supplemental index or not index them at all. If they are indexed they will eventually fall out of the index.
"Do you mean to get link value I need to have more authority backlinks to my blogspot or external links to other authority sites?"
Yes, but if you can get unique and valuable links, you will be better off getting them pointing straight at your main domain.
-
The strategy applied here is from SerpSpace syndication networks. From my experience, this strategy works well. It's not considered duplicate or thin content because you're giving an attribution link back to the original content. The Blogspot pages that have indexed have positively impacted the SERP rankings of the pages they link to within a day of indexing.
"And, the blogspot site will not pass any linkvalue to medicarefaq.com unless it has unique links into it from websites that are outside of your own network and not duplicates of websites that already link to medicarefaq.com."
Do you mean to get link value I need to have more authority backlinks to my blogspot or external links to other authority sites?
-
I visited a few of the post pages on the blogspot site. These pages appear to be simply the first few sentences from the medicarefaq.com website.  I don't think that Google will like these pages because they are: A) signposts for medicarefaq.com, B) duplicate content of medicarefaq.com, and C) thin content.
For the blogspot site to be an asset, the content needs to be unique and substantive. And, the blogspot site will not pass any linkvalue to medicarefaq.com unless it has unique links into it from websites that are outside of your own network and not duplicates of websites that already link to medicarefaq.com.
If these were my websites, I would put all of my time into medicarefaq.com and stop working on sites that merely link to it.