IF you click on the "fiter" button on the "just discovered" tab in OSE, page returns message "no links found."
This is weird because that message displays even when you didn't actually filter for anything, but just clicked on the button.
Welcome to the Q&A Forum
Browse the forum for helpful insights and fresh discussions about all things SEO.
IF you click on the "fiter" button on the "just discovered" tab in OSE, page returns message "no links found."
This is weird because that message displays even when you didn't actually filter for anything, but just clicked on the button.
Hi David,
I think that wildcard is correct, because we want to disallow all PDF files.
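For reference, the rule in question looks roughly like this (a sketch from memory, so treat the path as a placeholder):

# Applies to all crawlers; * matches any characters and $ anchors the match to URLs ending in .pdf
User-agent: *
Disallow: /*.pdf$

Note that the * and $ wildcards are extensions honored by Googlebot rather than part of the original robots.txt standard, so not every parser supports them.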
This is the same robots file as our other domain, avantcredit.com, and I was able to set up that campaign successfully on Moz.
Is there anything else that could be causing the issue?
I have a new URL, and I'm trying to create a new campaign for it.
But in the first step, when I enter the domain, an error message pops up saying the URL is invalid. Could you help?
My site is very new (~1 year old), but due to good PR we have gotten some decent links and are already ranking for a key term.
This may be why someone decided to start a negative SEO attack on us.
We had fewer than 200 linking domains up until two weeks ago, but since then we have been getting 100+ new domains a day with anchor texts that are either targeted to that key term or are from porn websites.
I've gone through the links and am ready to submit a disavow file... but should I do it?
My rankings/site traffic has not been affected yet.
Reasons for my hesitations:
1. Google always warns against using the disavow tool, saying "you shouldn't have to use it if you are a normal website" (which feels a bit guilty-until-proven-innocent).
2. Some say Google is only trying to get the data to see if there are any patterns within the linking sites. I don't want the site owners to get hurt, since the villain is someone else using xrumer to put spammy comments on their site.
What would you do?
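For reference, the disavow file I've prepared is just a plain .txt in Google's format, one entry per line - the domains below are placeholders:

# Lines starting with # are comments
# domain: entries disavow every link from that domain
domain:spammy-example-1.com
domain:spammy-example-2.net
# A single URL can also be disavowed on its own line
http://forum-example.org/spammy-thread-123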
Thanks Kate! This is really helpful. I guess we will go with no hreflang tag, and just the .com and .co.uk sites.
Yeah, I don't see why it would be a bad thing.
Hmm, not sure what's going on, so it would help to get more granular details.
I would suggest downloading the latest links from GWT and actually looking at the URLs that are linking to you.
The "Just Discovered Links" tab on OSE is really good too. I'm using my ahrefs tool less and less.
I would run this every week, since this tab only goes so far back.
With the SEO community focusing a lot on "online PR" now, I was wondering if there are great sites covering PR the way Moz or Search Engine Land cover SEO.
Does anyone know any?
Didn't know about that last tag!
Haha, you and Lesley are giving me two different answers, so I'm even more confused!
Hopefully more people can chime in with their comments?
Yeah, we are currently working on producing different content, including completely separate content and converting US to UK English, but there are some pages where duplicates are unavoidable.
I also thought this tag was not meant to handle duplicate content at all, but when you think about it more, that is essentially what it does - it exists for websites that have the exact same content in two separate languages. It's just a bit confusing when you have US and UK, since the language is the same, but there are still separate hreflang tags for them...
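To make that concrete, the US/UK pair would be annotated something like this in the head of both pages (the example.com/.co.uk URLs are placeholders):

<!-- Placeholder URLs - each page should carry both tags, including a reference to itself -->
<link rel="alternate" hreflang="en-us" href="http://www.example.com/page/" />
<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/page/" />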
The title says it all - if I have duplicate content on my US and UK websites, will adding the hreflang tag help Google figure out that they are duplicated for a reason and avoid any penalties?
It's an interesting idea. I think I'm going to side with having multiple pages.
1. As long as your site architecture is done right, even a new page should be supported with good authority from the domain
2. The old post can still have good content on it and receive long-tail visits that the new page will not receive
3. Wouldn't the user experience be much better on a site you can move around in, rather than one 30,000-word page? Your bounce rate might seem abnormally high too, which will affect rankings.
I know if you click on 'download latest links' you can get the actual pages that link to you, not just the domains. However, I'm not sure how to get the full profile.
Agree with Dan. If it's not any of the above and you are sure none of the other links are spam, you might not have been hit by Penguin 2.0 but by some other penalty.
I would check with other tools as well - use as many tools as you have at your disposal and create a comprehensive list of backlinks.
Hi,
I wouldn't block these pages from being crawled by search engines. Category pages are great for making sure more link juice flows to your deeper blog pages and for making sure they get indexed. I believe author links pass AuthorRank to the corresponding blog posts too. I'm not sure what you mean by 'read more' links. May I ask why you are concerned that these pages hurt SEO?
Thanks. The PDF is a good idea. But wouldn't you have to ask these blog owners to put a canonical on THEIR page pointing to mine?
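In other words, each blogger would need something like this in the head of their post, pointing back at my page (the URL is a placeholder):

<!-- Placeholder URL - would point at the original giveaway page on my site -->
<link rel="canonical" href="http://www.example.com/giveaway-details/" />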
I'm thinking of hosting a giveaway and promoting it to bloggers. Thinking about how to go about this, I've run into a sort of roadblock. The best approach seems to be attaching a flyer detailing the giveaway so that bloggers have easy access to the information. However, I fear that a lot of them will just copy and paste from the flyer straight into a blog post - which will create a lot of duplicate content.
Is anybody in the community willing to share their experience and how they got around the duplicate content issue?
Are all of your links pointing to the www version? Then this is what would happen: if you've rel=canonical'ed to the www version but build links to the non-www version, you will essentially build all link juice to the non-www and redirect it to the www. In the process you lose some link juice, just as you do with 301 redirects.
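If you want everything consolidated on the www version at the server level, the usual fix is a 301 redirect - assuming Apache with mod_rewrite, it would look something like this (example.com is a placeholder):

# Assuming Apache with mod_rewrite enabled; example.com is a placeholder
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]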
Other than PRWeb and PR Newswire, I also recommend Marketwire.
At the most recent SMX, Matt Cutts talked about how Google doesn't count press release links because they are essentially paid links.
The most important thing to think about when choosing a press distribution site is how wide a net it casts, so that your story actually gets picked up by journalists and they write about you, hopefully with a link. The actual link from the press release (ex. prweb.com/ xxxxx) has no SEO value, in my opinion.
There are no limits, but since the rule of thumb is that only the first h1 counts as an h1, I'm sure Google has a way to discount any xth h2 and beyond. I would use common sense: mark the more important headings as h2 and classify the others as h3 and h4, so that you can be sure Google gives more value to the more important titles.
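As a rough sketch of that hierarchy:

<!-- One h1, then h2 for the major sections, h3/h4 for less important subheads -->
<h1>Main page title</h1>
<h2>Important section</h2>
<h3>Supporting sub-point</h3>
<h2>Another important section</h2>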
You should definitely build links to the www version if that is your primary domain. The Moz Keyword Analysis only shows it without the www for cosmetic purposes - it doesn't mean the links were all built to non-www URLs. You can confirm this by looking at the Inbound Links tab.