Hi,
Would you be able to post a URL? Maybe there is something you have missed with the implementation; more eyes see more.
If you can't/won't post it publicly, send me a PM.
Keszi
Hi there,
This is one of the reasons why the hreflang tag was created. You can learn more about this here: https://moz.com/learn/seo/hreflang-tag
Also I would advise you to read Dave's post about it: https://moz.com/blog/hreflang-behaviour-insights
I hope this clears it up a little bit.
Keszi
I would terminate the contract with them (and I would even use those 6 weeks to have them clean up the work they have done).
Read through this post by Rand: https://moz.com/blog/google-says-yes-you-can-still-sculpt-pagerank-no-you-cant-do-it-with-nofollow
It is quite old, but still valid in my opinion. So if you want to nofollow the tags only to sculpt the value of the followed links, then there is no sense in doing it.
We are using a similar approach with pages we do not want indexed but that are beneficial for our visitors (a similar case to yours).
Hi Thomas,
I remember a mozinar (https://moz.com/webinars/advanced-wordpress-seo) that helped me a lot in answering these kinds of questions. I think it would be useful for you too.
Regarding your question: if it is useful for visitors but you believe it causes duplicate content (in any way), I would set those pages to noindex, follow.
If you have the time, watch the mozinar; it covers the issue in quite some detail.
I hope it will help you.
Keszi
Hi there,
It can take quite some time until you see the effect of a disavow file. For example, on one of our projects it took almost 6 months until we saw some recovery after submitting the file.
I would also advise you to check whether your rankings were affected only by the lower-quality links you had in the past. One tool that might help you with that is Barracuda's Panguin tool: http://barracuda.digital/panguin-tool/
Basically it takes your Google Analytics data and overlays it with the known Google updates. That could also highlight whether your website was hit by another update in the meantime.
I hope this helps a little bit.
Keszi
Hi Joginder,
I would propose contacting the help team with this kind of inquiry. The fact is that we, as a community, won't be able to help you out with such a technical issue (but engineers from Moz can take a look from their side).
Just go and write an email to them: help @ moz.com
I am convinced that they will quickly answer your question there.
Greetings,
Keszi
Hi Kevin,
1. Could you share the URL with us? That way we could take a closer look.
2. There are a few articles about the most common redesign issues, Thomas has listed them here: http://moz.com/community/q/website-redesign-seo-checklist
Maybe you will find something there.
I hope it helps! Keszi
It was the same in our case. We could do nothing more than just disavow him.
Unfortunately there are people who fall into his trap and pay for the removal.
His websites are not indexed in Google, so I hope Google doesn't count any links/anchors from his sites.
Did you implement anything lately? Just asking, because interestingly you have quite an old cache for this page only (31st of March), while other pages have a fresher cache in Google.
Neither Majestic nor Moz shows any links towards this page, so it could not be influenced by that.
Did you try fetching it in Google Webmaster Tools?
That Kumar guy is funny. We also had to deal with him in the past. He asks for a certain amount of money per link removed.
We tried talking to him first, in order to get the URLs removed, but in the end his websites ended up in our disavow file (and his response was: "I do not care").
Keszi
SEO tools are like personal assistants:
That is my shortlist.
I personally use Screaming Frog + Moz + Majestic + Xovi.
I hope this helps.
Gr., Keszi
Hey,
Meanwhile Dan gave you quite a nice answer below.
Gr., Keszi
Shaq, the reason it is not higher is that it hasn't been updated at all since 2013.
Regarding DA, check again the http://moz.com/learn/seo/domain-authority link that I provided in my previous answer.
Moz clearly states that you cannot directly influence this metric (or at least that it is a hard task); check the SEO best practices part at the bottom of the article.
Quote:
"Unlike other SEO metrics, Domain Authority is difficult to influence directly. It is made up of an aggregate of metrics (MozRank, MozTrust, link profile, and more) that each have an impact on this score. This was done intentionally; this metric is meant to approximate how competitive a given site is in Google.com. Since Google takes a lot of factors into account, a metric that tries to calculate it must incorporate a lot of factors, as well.
The best way to influence this metric is to improve your overall SEO. In particular, you should focus on your link profile—which influences MozRank and MozTrust—by getting more links from other well-linked-to pages."
As Alick300 mentioned, PageRank is a metric that we do not use anymore.
I would advise you to check the metrics that Moz is using: PA, DA, mR, DmR, and in case you are a Pro member, you could also check mT and DmT:
Be aware that these are only metrics, and in some cases increasing them doesn't necessarily mean that you will rank better: correlation <> causation (check the spam score article by Rand that I have linked).
You could also check this Whiteboard Friday: http://moz.com/blog/understanding-and-applying-mozs-spam-score-metric-whiteboard-friday
I hope this will be helpful to you.
Keszi
No worries! I have dug into canonical questions so many times that I thought I would clarify.
Basically, in my view, canonical links are a double-edged sword; if misused, they can lead to total disaster.
What I would add to Lewis's answer, just to make it clear: canonicals do pass link juice. If you are interested, you can check this article from Dr. Pete: http://moz.com/blog/an-seos-guide-to-http-status-codes
But for sure the 301 would be the best practice in your case, because there would be no good use in having the pictures on 2 URLs (the keyword and non-keyword versions).
I hope this helps.
Keszi
Hey,
There are so many things that could influence your rankings that it is hard to tell whether it is only the links or other factors as well.
If you see a traffic drop from a specific date, then you could check the Panguin tool from Barracuda: http://www.barracuda-digital.co.uk/panguin-tool/
If the traffic drop falls on a specific update date, then you could investigate that update and start from there.
Let me know, if you have further questions.
Keszi
What I would try is including a self-reference as well.
So next to your existing hreflang tags a new one would appear, such as:
When we implemented hreflang, we also used a self-reference, and we do not have any issues there.
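For illustration only (example.com, the paths, and the language codes are placeholders, not your actual setup), the full hreflang set on the English page would include its own URL alongside the alternates:

```html
<!-- self-reference: the English page lists itself -->
<link rel="alternate" hreflang="en" href="https://www.example.com/page/" />
<!-- plus the alternate language version(s) -->
<link rel="alternate" hreflang="de" href="https://www.example.com/de/page/" />
```

The same full set goes on the German page too, so every version points to itself and to all of its alternates.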
I hope that helps.
Keszi
Hi,
Try out your page with the Flang tool. Maybe it will give you better insight into what you are missing.
We cannot do more without a link that we can inspect.
BTW, a common mistake: do you have a self-reference? It is needed in order for hreflang to work well.
Keszi
Try getting back to the basics. Create an audit of the website, for the pages that you'd like to rank. Check for accessibility.
For example, on one of our projects a simple development error caused our new content to be hidden from search engines (a simple content tab that had an issue with JavaScript).
Sometimes getting back to the basics and creating an audit of our website can highlight issues that stop us from ranking in the top 10 spots.
Domain Authority (DA) - indicates how likely a domain is to rank based on how authoritative search engines consider it to be based on its backlink profile.
Page Authority (PA) - indicates how likely a page is to rank based on how authoritative search engines consider it to be because of which sites link to it.
I have quoted these from the Moz Glossary.
It is important to highlight the use of the word "likely"! Having a higher DA or PA doesn't necessarily mean that you will outrank a specific website. DA and PA are mostly based on the links that a specific website/webpage gets (at least that is how I look at it).
There are so many other things that could influence a webpage's rankings.
I hope it helped,
Keszi
My favorite for yesterday came from Barry Schwartz: https://www.seroundtable.com/google-search-quality-lead-april-fools-20084.html
I was laughing out loud while reading it.
In any case, I would recommend an on-site audit. You can do it yourself, or ask an outsider in order to get a fresh perspective on your website.
Let me know, if you need help.
Greetings, Keszi
Yes, Wordpress SEO is the Yoast plugin: https://wordpress.org/plugins/wordpress-seo/
Hi,
I'd quote from Moz: http://moz.com/learn/seo/domain-authority
"How do I influence this metric?
Unlike other SEO metrics, Domain Authority is difficult to influence directly. It is made up of an aggregate of metrics (MozRank, MozTrust, link profile, and more) that each have an impact on this score. This was done intentionally; this metric is meant to approximate how competitive a given site is in Google.com. Since Google takes a lot of factors into account, a metric that tries to calculate it must incorporate a lot of factors, as well.
The best way to influence this metric is to improve your overall SEO. In particular, you should focus on your link profile—which influences MozRank and MozTrust—by getting more links from other well-linked-to pages."
So, answering your question: yes, you can improve the DA while working on your SEO.
In general, in my belief, the 301 redirect could have influenced the score, but I would rather check the rankings and organic evolution of the website after the redirect instead of checking only the DA. If the organic results improved, I wouldn't worry too much about the DA itself (maybe it is just a temporary glitch until Moz recalculates its value based on the new information they get).
Keszi
Carlos, I will take a look and send you a PM.
In general there is no quick way to do SEO (if you want to play it safe), especially if we are talking about a competitive market.
Hi Carlos,
SEO in general isn't a black-and-white, "I implement it and it works" science. The results appear over time.
If I had to compare SEO to something, I'd put it like this: it is an ongoing, endless chess game with an unlimited number of players who can all move at the same time. And then there is Google, who can change the rules of the game at any point.
Seeing results in this industry (or, as you mentioned it, "SEO time") will be quite different with every project, because it doesn't depend only on you and/or Google. There are the other websites, which are constantly evolving and developing.
I hope you will see your results soon.
Keszi
Oh, I know the feeling! I'd say, try to make an audit of the site, then make a priority list and take it step by step.
Let me know, if you need help.
Hi Ravi,
It is not free, but worth the money ($40/month), I think: https://www.distilled.net/u/
There you have a lot of modules with which you can learn and test your knowledge.
Keszi
I use it on a daily basis, because it gives me quite a good picture of what we might miss with manual checking. I hope it will do the trick for you too.
BTW, a good-to-know article for Screaming Frog: http://www.seerinteractive.com/blog/screaming-frog-guide/
That article might speed things up for you and Screaming Frog (for future use).
Hi Kelly,
There are a few nice articles about this topic. I will point out a few; maybe these checklists will help you determine what could have gone wrong with the redesign:
I hope it helps! Keszi
Hi Carl,
What I'd try to do is run some crawls of the website with Screaming Frog, and try to analyze how your content is crawled and what could have gone wrong.
Also try to keep a timeline of implementations on the website. (I work on quite a big website, and keeping everything organized for myself helps me analyze issues like yours. I will try to make a copy of my Excel timeline and post it here in the following days, if you'd like.)
Another thing that you could do is fetch and render the pages. Check for anything out of the ordinary.
These are a few ideas of where I would start digging.
I hope it helps. Keszi
Hi Ravi,
I personally use Screaming Frog to create the sitemaps. I create an advanced exclude/include list for the crawler, then run a crawl on my website and export the sitemap.
After the sitemap is created, I double-check it in an editor like Notepad++.
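As a sketch of what to look for when double-checking (example.com, the URLs, and the date are placeholders), each entry in the exported file should be a well-formed <url> block inside the standard sitemap namespace:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2016-04-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/sample-page/</loc>
  </url>
</urlset>
```

Things to watch for in the editor: stray URLs your include/exclude list should have filtered, non-canonical or parameterized URLs, and unescaped characters in the <loc> values.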
I hope this will work out for you too.
Keszi
Hi David,
What I'd do first of all is check whether there is any huge traffic fluctuation coinciding with any of the Google updates released.
An easy way to do this is to use the Panguin Tool from Barracuda: http://www.barracuda-digital.co.uk/panguin-tool/
Maybe that will help you recognize what the issue could be.
If not, try to check what implementations have been done in the past few months.
I hope this helps! Greetings, Keszi
Glad to hear it got resolved so quickly! Keep up the good job!
Hi there,
IMO the two pages are near duplicates, and I have seen pages get indexed in such cases (on one of our own projects). Then we took the canonical into consideration, and the issue was solved. This is why I recommended it, but I am open to other solutions as well.
Gr., Keszi
Hi Luke,
I would implement a canonical in any situation where there can be parameters in the URL, this way telling any search engine to consider only one (the original) version of that page.
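As a hypothetical illustration (example.com, the path, and the parameter names are placeholders): both the clean URL and any parameterized variant of it, such as /shoes/?sort=price&color=red, would carry the same tag in their <head>:

```html
<!-- same tag on the clean URL and on every parameterized variant -->
<link rel="canonical" href="https://www.example.com/shoes/" />
```

That way, whichever parameter combination gets crawled, the search engine is pointed back to the one original version.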
Gr., Keszi
Hi!
First of all welcome to the community!
Regarding your question, I would implement a canonical which would self-reference:
Regarding the hreflang implementation: you will always need a self-reference in order to make it work. So, for example, on an inner page called Sample you would have on all languages:
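Something like this, assuming a placeholder domain and path, placed in the page's own <head>:

```html
<!-- self-referencing canonical: the page points to its own clean URL -->
<link rel="canonical" href="https://www.example.com/sample/" />
```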
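As an illustration (the domain, paths, and language codes are placeholders for your actual setup), every language version of the Sample page would carry the identical full set, including its own URL:

```html
<!-- the same three lines appear on the en, fr, and de versions of Sample -->
<link rel="alternate" hreflang="en" href="https://www.example.com/en/sample/" />
<link rel="alternate" hreflang="fr" href="https://www.example.com/fr/sample/" />
<link rel="alternate" hreflang="de" href="https://www.example.com/de/sample/" />
```

The line matching the page's own language is the self-reference; without it, the annotations on the other versions may be ignored.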
I hope this helps
Gr., Keszi
Hi Tamir,
First of all welcome to the Moz community!
I would propose that you take a few hours and watch the following two mozinars:
The Advanced Wordpress SEO one helped me a lot in understanding how to approach technical implementations on Wordpress for SEO purposes. It is quite old, but I do not think it is outdated at all.
I hope these help answer your questions.
Gr., Keszi
Hi Daniel,
Having a friendly/readable url structure will surely help you in several ways.
You can read more about it here:
In the second article, check the "URL construction guidelines" part.
But before you start such a development, check how you could avoid a URL-structure redesign failure (such as forgetting the 301 redirects from the old structure to the new one).
I hope it helped. Gr. Keszi
It is possible that you still have it in the sitemap. Did you check that?
Gr., Keszi
Hey,
IMO when it comes to redirects/merges on a website, the first thing you want to ask yourself is whether the content you are merging is more related to your homepage or to an inner page.
What I am trying to point out is: if you have two websites, one selling independent products/product groups and the other being the branded website, I would create a new inner page (or select an existing inner page) on the branded website to host the new products, and redirect towards that page.
If we are talking about two websites that were hosting almost the same products, then I would go for a more detailed redirect, where individual product pages get redirected to their corresponding versions on the selected domain. Only if this scenario cannot be applied would I redirect the whole domain to the homepage.
As mentioned above, this is my opinion, but I hope it helps you.
Gr., Keszi