MBG Tracker...how to use it?
-
So I am a new blogger that has been submitting guest blog posts to a number of different blogs. It was recommended that I use the MBG Tracker so I can track the back links. The problem is that I am totally lost on how to use this tool. As I said before I am new to this whole thing and I am not really sure what constitutes a "base link" and a "back link." In the author bylines we are linking to different pages within a larger website. If anyone can help me I would really appreciate it!
-
Thanks for the reply Ann! David, SEOmoz and MyBlogGuest are different products, you'll probably find the best help for MBG talking with Ann or using her support forums.
-
Let me try to be very clear. Please email me if you have more questions (I got both of your emails, so feel free to keep using email).
When guest blogging, are you promoting ONE site in your byline in each of the guest posts? You can link to different pages of the site, but they should all be on one and the same domain.
For example:
- Project name: SEOmoz
- Base domain: seomoz.org
Then when you decide to upload this page, for example: http://www.geekwire.com/2012/seomoz-secret-startup-success-rand-fishkins-tips/ - the tracker will automatically identify the backlink to track - http://www.seomoz.org/dp/seattle-startup-marketing - because it has seomoz.org base domain.
By providing the base domain, you make your own job easier because you don't have to manually provide the tool with the backlink to track!
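To make the matching concrete, here's a rough Python sketch of the base-domain idea - not MBG's actual code, just an illustration using the standard library:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkCollector(HTMLParser):
    """Collect href values from every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def find_backlink(page_html, base_domain):
    """Return the first link whose host matches the base domain
    (with or without a leading 'www.'), or None if nothing matches."""
    collector = LinkCollector()
    collector.feed(page_html)
    for href in collector.links:
        host = urlparse(href).netloc.lower()
        if host == base_domain or host == "www." + base_domain:
            return href
    return None
```

With base domain seomoz.org, a page containing a link to http://www.seomoz.org/dp/seattle-startup-marketing would have that URL picked out automatically, which is the convenience described above.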
-
So the base link would be the link to the blog post and the back link is the link that goes back to my site? Is this correct? I had seen the site that you suggested but like I said I was confused by the base link and the back link.
-
The base link is the link you're promoting (a page on your own domain), while the backlink is the link in the published guest post that points back to it - the backlink is what carries the value in guest blogging.
HOW DOES THE BACKLINK MONITORING TOOL WORK?
This is the cheapest and simplest alternative I am aware of:
- Receive instant emails if any of your links are removed or nofollowed;
- Track your guest posts' social popularity (Likes and Tweets);
- Measure traffic from each of your guest posts (for that you'll need to install a tracking code);
- Identify your most efficient backlink sources (for example, to find places where you can guest post again).
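The removed/nofollowed check in the first bullet can be sketched the same way - again a stdlib-Python illustration, not the tool's actual implementation:

```python
from html.parser import HTMLParser

class BacklinkAudit(HTMLParser):
    """Record whether an <a> to the target URL exists on the page,
    and whether that link carries rel="nofollow"."""
    def __init__(self, target_url):
        super().__init__()
        self.target_url = target_url
        self.present = False
        self.nofollowed = False

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        if attrs.get("href") == self.target_url:
            self.present = True
            if "nofollow" in (attrs.get("rel") or "").lower():
                self.nofollowed = True

def audit_backlink(page_html, target_url):
    """Classify a tracked backlink as 'ok', 'nofollowed', or 'removed'."""
    audit = BacklinkAudit(target_url)
    audit.feed(page_html)
    if not audit.present:
        return "removed"
    return "nofollowed" if audit.nofollowed else "ok"
```

A monitoring job would fetch each guest post on a schedule, run a check like this, and email you when the status changes.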
To start tracking your guest posts or links, you'll need to create a new project and specify your base URL. For all pages you add within one project, that base URL will be monitored automatically (even if you don't specify which link within a given page you want to track).
You can add/remove columns from the table (for a clearer view) and sort by each column. You will also receive weekly summaries of your missing links and most popular link sources.
Handy URL for more info: http://www.seosmarty.com/meet-mbg-tracker-the-simplest-and-cheapest-way-to-monitor-your-backlinks/
Related Questions
-
Unsolved: Using NoIndex Tag instead of 410 Gone Code on Discontinued products?
Hello everyone, I am very new to SEO and I wanted to get some input & second opinions on a workaround I am planning to implement on our Shopify store. Any suggestions, thoughts, or insight you have are welcome & appreciated! For those who aren't aware, Shopify as a platform doesn't allow us to send a 410 Gone code/error under any circumstance. When you delete or archive a product/page, it becomes unavailable on the storefront. Unfortunately, the only thing Shopify natively allows me to do is set up a 301 redirect. So when we are forced to discontinue a product, customers currently get a 404 error when trying to go to that old URL. My planned workaround is to automatically detect when a product has been discontinued and add the NoIndex meta tag to the product page. The product page will stay up but be unavailable for purchase. I am also adjusting the LD+JSON to list the product's availability as Discontinued instead of InStock/OutOfStock.
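For what it's worth, the two head changes described here can be generated together. A minimal Python sketch - the availability value https://schema.org/Discontinued is real schema.org vocabulary, but the function and product name are just illustrative:

```python
import json

def discontinued_head_tags(product_name):
    """Build the robots noindex meta tag plus LD+JSON markup
    flagging the product as Discontinued (illustrative sketch)."""
    meta = '<meta name="robots" content="noindex">'
    ld = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": product_name,
        "offers": {
            "@type": "Offer",
            "availability": "https://schema.org/Discontinued",
        },
    }
    script = ('<script type="application/ld+json">'
              + json.dumps(ld) + "</script>")
    return meta + "\n" + script
```

On Shopify this logic would live in the theme's Liquid templates rather than Python, but the output markup would look the same.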
Technical SEO | BakeryTech
Then I let the page sit for a few months so that crawlers have a chance to recrawl and remove the page from their indexes. I think that is how that works?
Once 3 or 6 months have passed, I plan on archiving the product followed by setting up a 301 redirect pointing to our internal search results page. The redirect will send them to search with a query aimed at similar products. That should prevent people with open tabs, bookmarks, and direct links to that page from receiving a 404 error. I do have Google Search Console set up and integrated with our site, but manually telling Google to remove a page obviously only impacts their index. Will this work the way I think it will?
Will search engines remove the page from their indexes if I add the NoIndex meta tag after it has already been indexed?
Is there a better way I should implement this? P.S. For those wondering why I am not disallowing the page URL in the robots.txt: Shopify won't allow me to call collection or product data from within the template that assembles the robots.txt, so I can't automatically add product URLs to the list.
-
Client bought out shop but used existing phone number
We have a client in Nashville who opened his first location on Spring St., then later bought out PAC Auto to open a second location on Dickerson St. Lately, we noticed that the Dickerson location wasn't ranking. I found that the previous business owner at PAC Auto had already built up a good web presence and that, sigh, our client was using their old number. Basic NAP violation, OK, got it. But what to do next? I decided to update PAC's citations with The Car People's business name and website. Where I was unable to edit or where listings were already claimed, I just reported PAC Auto as closed. But yesterday I noticed not only was the Dickerson location still not ranking, but the Spring St. location had indeed dropped several places too! (edit: I'm referring to local search results here, as we don't own the site.) What kind of beast have I stirred?! What kind of signals am I sending to Google that are devaluing the Spring St. location? Will things get worse before they get better? What can I do to make progress on one without hurting the other? Is it worth trying to get the previous business owner's logins (not likely)? Talk to The Car People about getting a new number (not impossible)? Is it worth trying to get the site in order to build separate landing pages for each location? Thanks in advance!
Technical SEO | cwtaylor
-
We are using hotlink protection on our server, mostly for JPGs. What is the moz.com address to allow crawl access?
We are using hotlink protection on our server, mostly for JPGs. What is the Moz crawler's address so we can allow it in the list of allowed domains? The reason is that the crawl statistics give us a ton of 403 Forbidden errors. Thanks.
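Typical hotlink protection is referer-based, so one hedged option (assuming Apache mod_rewrite; your host's rules may differ) is to exempt requests from rogerbot, Moz's crawler, before the blocking rule fires:

```apache
RewriteEngine On
# Let Moz's crawler (rogerbot) through before the hotlink check.
RewriteCond %{HTTP_USER_AGENT} !rogerbot [NC]
# Block image requests with a foreign (non-empty) referer.
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
RewriteRule \.(jpe?g|png|gif)$ - [F]
```

Here example.com stands in for your own domain; whether your host's hotlink protection exposes its rules this way is something to verify in its control panel.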
Technical SEO | | sergeywin10 -
Windows Access used for e-commerce site - help needed
Hello everybody, I am working on an e-commerce website built on Windows Access, and it's a nightmare to change the HTML content on it. Has anyone used it before? It doesn't allow me to change the content for the HTML tags even though it should, and I don't have a clue what to do. Thanks, Oscar
Technical SEO | PremioOscar
-
Why does everyone use bitly?
Why do people use bitly? I thought it was just a way to share a link on Twitter if the URL was too long. I see SEOmoz shares all their content with a bitly link, even when they share it on Google+. Why?
Technical SEO | JML1179
-
Rank Tracker and Rankings report differ
Hi all, is this normal? I have set up a campaign for a site, tracking a variety of keywords. For one of them, a quite important keyword I've been working on, I've moved down one step in my rankings report. This is weird, first of all because my on-page optimization went from grade C to A, and even weirder because if I run the Rank Tracker tool on the keyword and the URL, I see that I've moved up 6 steps, to 15 in Google. Kinda makes it hard to grasp whether I'm on the right path or not! (I've checked, and they are both results on google.dk, same URL and same keyword - exact.)
Technical SEO | Budskab
-
NoIndex/NoFollow pages showing up when doing a Google search using "Site:" parameter
We recently launched a beta version of our new website on a subdomain of our existing site. The existing site is www.fonts.com, with the beta living at new.fonts.com. We do not want Google to crawl the new site until it's out of beta, so we have added the following on all pages: However, one of our team members noticed that Google is displaying results from new.fonts.com when doing a "site:new.fonts.com" search (see attached screenshot). Is it possible that Google is indexing the content despite the noindex, nofollow tags? We have double-checked the syntax and it seems correct, except the trailing "/". I know Google still crawls noindexed pages; however, the fact that they're showing up in search results using the site search syntax is unsettling. Any thoughts would be appreciated! DyWRP.png
Technical SEO | ChrisRoberts-MTI
-
Using differing calls to action based on IP address
Hi, We have an issue with a particular channel on a lead generation site where we have sales staff requiring different quality of leads in different parts of the country. In saturated markets they require a stricter lead qualification process than those in more challenging markets. To combat the problem, I am toying with the idea of serving very slightly different content based on IP address. The main change in content would be in terms of calls to action and lead qualification processes. We would plan to have a "standard" version of the site for when IP location cannot be detected. URLs on this version would be the rel="canonical" for the location-specific pages. Is there a way to do this without creating duplicate content, cloaking, or other such issues on the site? Any advice, theories, or case studies would be greatly appreciated.
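A rough sketch of the serving side (plain Python, stdlib only; region_for_ip is a hypothetical stand-in for a real GeoIP lookup, and the URLs are made up):

```python
CANONICAL_URL = "https://www.example.com/quote/"  # the "standard" version

def region_for_ip(visitor_ip):
    """Hypothetical stand-in for a real GeoIP lookup."""
    return "saturated" if visitor_ip.startswith("203.") else "standard"

def render_quote_page(visitor_ip):
    """Serve slightly different calls to action by visitor region,
    while every variant declares the standard URL as canonical."""
    if region_for_ip(visitor_ip) == "saturated":
        cta = "Complete our lead qualification form to get started."
    else:
        cta = "Call now for a free quote."
    return ('<link rel="canonical" href="{0}">\n'
            '<p>{1}</p>').format(CANONICAL_URL, cta)
```

The idea is that because each regional variant points rel="canonical" at the standard page, the variants shouldn't register as duplicate content - though whether the CTA differences alone avoid cloaking concerns is exactly the judgment call being asked about here.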
Technical SEO | SEM-Freak