Thanks for the answers. I didn't have those figures earlier, but I guess you are right; I didn't handle them correctly.
I guess it was a mistake on my part. Thanks again.
Hi,
I just recently started getting errors when using the API. The response contains zeros for all the keys except "pda", which is new to me. I haven't changed anything in the code, and the authorization seems to work fine. Is this something on my end? I don't get it for all requests, though; perhaps half of the time.
Example of the return: {"fmrp":0,"fmrr":0,"pda":27.7812456171725224,"ueid":0,"uid":0,"uipl":0,"umrp":0,"umrr":0,"upa":1}
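In case it helps anyone debugging the same thing, a small client-side check could flag these suspect responses so they can be retried instead of stored. This is just a sketch; `looksZeroed` and the field list are my own naming based on the example response above, not part of the API:

```javascript
// Flag responses where every metric except "pda" (and "upa", which was 1
// in my example) comes back as zero. Field names are taken from the
// example response above.
function looksZeroed(data) {
  const fields = ["fmrp", "fmrr", "ueid", "uid", "uipl", "umrp", "umrr"];
  return fields.every((key) => data[key] === 0);
}

const sample = {"fmrp":0,"fmrr":0,"pda":27.7812456171725224,"ueid":0,
                "uid":0,"uipl":0,"umrp":0,"umrr":0,"upa":1};
console.log(looksZeroed(sample)); // true, so this request would be retried
```

When it returns true you could re-queue the request with a short delay rather than treating the zeros as real metrics.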
Thanks in advance!
/Mathias
Before the change I could see them under Traffic -> Referral. I'm not sure how it will be shown now; I'll have to get back to you on that once I've had a couple of visitors.
I was hoping for some more input on this. I appreciate your input, David, and will change the meta robots around a bit. But I thought I would get some more opinions, perhaps from someone who wanted the categories set to index, follow, as I have heard argued elsewhere.
Wonderful solution. Don't know why I didn't think about it myself. Thanks a lot!
If anyone has the same problem and wants the buster on all pages, you can use:
if (top.location != self.location) { top.location = self.location.href; }
I only needed it for one page. It will be interesting to see how long they will feature us. One thing, though: if the visitors of the other site have JavaScript disabled, this won't work.
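The JavaScript-disabled caveat can be covered server-side with the X-Frame-Options response header, which the browser enforces regardless of whether scripts run. A sketch for Apache (assuming mod_headers is enabled; other servers have equivalent directives):

```apache
# Sent with every response: browsers refuse to render the page inside a
# frame on any other origin, even when JavaScript is disabled.
Header always set X-Frame-Options "SAMEORIGIN"
```

Note that this blocks all cross-origin framing outright, so it would also end the "featured" embed entirely rather than redirecting visitors to your site.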
I just noticed that one of our sites is "featured" on another site. The only thing on this site is an iframe that has our site in it, i.e.:
<iframe name="ForwardFrameqAW52htK" src="http://www.oursite.com/subdirectory"></iframe>
This got me thinking of two questions. First, are there any dangers for us in being included like this?
And secondly, why would they do this? To make people believe they have content and get traffic, and after that change the content to their own? Or is there anything else?
For some of our sites we use WordPress, which we really like working with. The question I have is about the category and author pages (and similar pages), i.e. the ones that look like: http://www.domain.com/authors/. Should you or should you not use follow, noindex for meta robots?
We have a lot of categories/tags/authors, which generates a lot of pages. I'm a bit worried that Google won't like this, and I'm leaning towards adding follow, noindex. But the more I read about it, the more I see people disagree. What does the SEOmoz community think?
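For reference, if you do go the noindex route, the tag would sit in the head of the archive templates only; in WordPress this is usually handled by an SEO plugin rather than edited by hand. A sketch of what it outputs:

```html
<!-- On category/tag/author archive templates only: keep the page out of
     the index, but still let crawlers follow its links to the posts. -->
<meta name="robots" content="noindex, follow">
```

The single post and page templates would keep the default (indexable) behavior, so link equity still flows through the archives to the content you do want ranked.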
Perhaps you are right about the links, but it feels strange that we would have dropped that far down in that case. We have around PA 57 and DA 48, with almost 250 root domains linking to us, which is quite high compared to the competitors. Even if some of the links got devalued, we shouldn't drop 100-200 places for many of our more difficult keywords.
I will continue working on the links, and go through the on-page SEO to see if I can turn things around. It would be sad to let the site go.
For one of our biggest sites I noticed a big ranking drop in the beginning of December. The site is in Swedish, so the Google updates are a bit different than for google.com. For many of our keywords we dropped far down in the SERPs. Most of the keywords we had in the top 10, or top 20, but these first dropped down to 100/150+; since then they have started to recover a bit, though nowhere near the previous results.
For our main keyword (a really competitive one) we got dropped completely (+200). The site is about 7 years old, has good content, and is regularly updated (1-2 news items a day, 2-4 new articles a week).
Our traffic is still pretty good, actually almost the same as before the big drop for the "hard" keyword. We seem to get more traffic from our long-tail keywords instead, which is good. I would like to be able to get traffic from both types of keywords, though.
One more thing to add: I have noticed that a couple of our competitors, both in this area and others, also got hit in the same way. This suggests that some sort of Google update/filter has hit us.
So, in short, my question is whether anyone knows about an update/filter in the beginning of December for the Swedish market. And how can we work our way back?
Edit: Also, the site ranks number one for the title of the page, and for the titles of the hard keyword pages the site also ranks number one. The "right" pages also come up at the top when I do a search for "site:domain.com keyword".