Hi there.
Can you specify what exactly you mean by "consolidate link prospects", please?
Uhmm... Are you suggesting that SEO comes down to "just a matter of mentioning it in a blog post"? Man, we are all in the wrong business.
Of course not! To rank for another country, you need to do every single step of the SEO process with a focus on that country. If you are talking about making the same domain rank for several countries, then you gotta do all the SEO steps plus a ton of technical work to make sure that your content is not duplicated and is properly targeted.
Please read this:
Hi there.
The question would be whether it's important to you. If it doesn't matter to you whatsoever, then let Google choose. However, make sure that when you build backlinks or leave a link to your domain anywhere, you use the one Google chose, just for consistency.
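If you do pick one yourself, here's a minimal .htaccess sketch to enforce it (assuming Apache, with non-www as the preferred version; domain.com is a placeholder):

# 301-redirect www to non-www so only one version collects links
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.domain\.com$ [NC]
RewriteRule ^(.*)$ http://domain.com/$1 [R=301,L]

That way every link, whichever version it points to, ends up at the same address.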
Hi there.
Have you read this? http://googlewebmastercentral.blogspot.com/2013/04/5-common-mistakes-with-relcanonical.html
That might answer some of your questions.
Now, did you allow some time for Google to "realize" the canonical links after you made them live?
Hi there. We are using Hootsuite and it's working very well for us. Why do you not like it?
Hi there.
So, you say that the website has been penalized due to being hacked, even though there is no manual action and no "this site may be hacked" under your domain in SERPs, and you base your conclusion on 404s and a drop in rankings. Or, to rephrase, you explain the drop in rankings by the website being hacked, even though there is no proof of it. Hmmm... Doesn't that sound suspicious?
How about the website's SEO? UX/UI? Crawlability? Your industry trend? Competitors' SEO improvements? Your backlink profile? There are sooooo many other reasons rankings can drop besides the website being hacked. Are all the things I listed in perfect shape?
Hi there.
It's not me suggesting it, it's Rand Fishkin (founder of MOZ) saying it - check the link I refer to in my answer, second paragraph. (It's quoted in my answer as well.)
Hi there.
Well, it surely confused me. Yes, it's confusing to bots and seems kinda shady to me. If it's possible, do it straightforward: canonicalize to the preferred URL.
I say keep the blog under the same directory, unless you produce it in different languages.
Hi there.
Everything seems good to me. Just make sure that you use proper hreflangs or canonicals for content that can potentially be duplicate, that you have a proper/correct sitemap, and that there are no problems with crawlability and accessibility.
Good luck
1. We got both - search for "Hyperlinks Media" or "Hyperlinks Media Houston".
2. Yes, it was a Google My Business rep. We are a GYBO partner, so it was one of those guys, which, I believe, is the same as a GMB rep.
Hi there.
We had the same problem and got it solved recently. First of all, make sure that all schema is correct. Also make sure you have a verified G+ account for that business (apparently it's important). Also see if the knowledge graph comes up if you search for "brandname reviews". And call Google. In our case, they found some technical issues on their side, and while they were fixing them, they gave us some ideas to apply on our side to improve local listings presence.
Hope this helps.
Hi there.
It's more of a technical problem, so rather than looking for an answer on a forum, please email help@moz.com with an explanation of your issue and your account credentials - they'll look into it for you.
Hi there.
OK.
A large amount of crawled pages doesn't mean that every single page has been fully downloaded fresh rather than served from cache. If that were happening, Google would go bankrupt in two days. Most likely Google checks the difference between the cached and live page and updates the information, rather than downloading the whole thing every time it crawls a page. That's why there is a "fetch as Google" and a limit on crawled pages.
It would depend on which pages those are. If it's one of the main pages (like the index or a main service page), it'd be pretty bad. If it's a page buried under thousands of other pages, then no, it wouldn't be a problem. However, when Google does get to that page, you waste that crawl, since Google's bots have limited resources per domain. Basically, you'd be wasting bot resources on that non-responsive page.
I'd recommend either fixing it or deleting that page if it's not important.
Cheers.
Hi there.
No, there is no other way than backlinks. Basically, the way search engines work is that they know what your website (domain) is about and then try to find the best page to match the query. One of the strongest signals engines look at is backlinks. So, if your home page has a good backlink profile, the secondary page doesn't have any, and Google understands that your website is about, let's say, apples, your index page will keep ranking no matter how much you optimize the content on the secondary page.
You can try doing solid internal linking with exact-match anchor text, but I doubt it will help much. Just try to get a couple of good backlinks to the secondary page and you'll be fine.
Hi there.
Well, Google, Bing and Yahoo use completely different ranking algorithms, so it's no surprise that your rankings differ. I don't think you really can optimize for all search engines. You just optimize for Google and hope that you rank on the others as well. So, to answer your question - I don't think the problem is with Google. It's most likely with your website. Make sure that your website is well optimized (tech and non-tech side), you have a good backlink profile, your content is outstanding and your social media presence is huge. Also check for possible algorithm spam hits like Penguin, etc.
Hope this helps.
Hi there.
Hreflang is the way to go. No matter how you decide to organize the domain - subfolder or subdomain - if the actual content is the same, you'll have duplicate issues. Also, a canonical link is probably not the way to go, unless you want one of your versions (let's say the Canadian one) not to rank.
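For illustration, a minimal hreflang sketch (assuming subfolders and US/Canadian English; the URLs are placeholders), placed in the head of both versions:

<link rel="alternate" hreflang="en-us" href="http://domain.com/us/" />
<link rel="alternate" hreflang="en-ca" href="http://domain.com/ca/" />

Each version references itself and the other, so Google can serve the right one per country without treating them as duplicates.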
Hope this helps.
Hi again. I've seen it. Quite honestly, I disagree with absolutes being a priority. The arguments presented in that WBF don't really outweigh the development pain for me (I believe she mentioned even more drawbacks). Also, from my experience, I have not seen any (at all) benefits in any way (SEO or loading speed) from having absolutes rather than relatives.
Oooooh man, oh man. Hi there.
The same question I ask myself every day - should I join the guild of spammers?
Well, during such moments, my brain, thankfully, still works in a "reasonable", not "do-whatever-you-want" mode.
To your question - don't do it. I know it's tempting, but it's not a long-term solution. It might work short-term, but what's the point, unless you aren't planning to stay in business for a while? Just keep doing good-quality white hat, maybe invest more into offsite SEO and branding, and the results will come. As an example: it took us a year of constant work to rank in the top 5 for "seo houston" and "web design houston" from nothing. But we are on the first page now, and who knows where we'd be if we had been doing black hat - there were so many major hits by Google in 2015.
So, just keep swimming, just keep swimming, my friend.
Hi there.
There are lots of topics on MOZ about this issue. Have you looked at all of those and made sure that none of the problems listed in those threads are applicable?
Or should I fix the issue first via htaccess rule before attempting the migration
I quite honestly think that the problem is WITH htaccess, not that you have to fix something else with htaccess.
And as an answer to your question - you can always migrate with issues and hope that nothing breaks during the process, or try to patch it up so it seems to be working fine and, again, hope that it doesn't break on you, OR you can get it fixed at the root of the problem and not worry about it in the future.
Hi there.
Well, don't put duplicates on every product page, that's for sure. Option #1 is a very good idea. You say that you are afraid of users leaving the product page and not coming back. Here is my idea:
Do option #1, but also dynamically "transfer" the product to that page. So, for instance, you are on a product page, domain.com/product1.php; when you click on a link to the information (which is, let's say, domain.com/information.php), add a parameter to that link based on the product page URL you were coming from, like so: domain.com/information.php?product1.
And then add an extra section on the information page with product details, the possibility to add to cart, etc., based on the parameter. This way you can exclude URLs with parameters from indexing (read here) or canonicalize all parameter pages to the info page, and you won't have any duplicate issues.
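For example, the parameter version would carry a canonical back to the clean info page - something like this (reusing the hypothetical URLs above) in its head:

<!-- on domain.com/information.php?product1 -->
<link rel="canonical" href="http://domain.com/information.php" />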
Cheers
Hi there.
It looks like a problem with an htaccess rewrite rule or redirects. Most likely a rewrite rule, since you say that it takes you to an HTML page, not the image.
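One common culprit (just a guess without seeing your htaccess) is a catch-all rewrite that doesn't exclude real files, so image URLs get rewritten to a PHP handler. Adding file/directory checks usually fixes that; the index.php target here is hypothetical:

# serve existing files (like images) directly, rewrite everything else
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)$ index.php?page=$1 [L,QSA]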
Check that out.
Hi. Bad idea. It won't help with rankings. It will only bring confusion to users and, since it's kinda spammy and manipulative, wrath from search engine crawlers.
Why do you want to redirect specifically? Why not just create a page domain.com/keyword, put related content, make it awesome, build backlinks to that page, if possible with exact or partial anchor text?
If you check search results, it's far, far, far from always the index page that ranks for every keyword.
Hope this helps.
Hi there.
So, you have all your links absolute? Not relative? Gotta be painful to manage...
Well, anyway, to answer your question - the only bad part about not changing links to https would be that extra redirect. If your servers are good, fast and very reliable, nobody would probably even notice it. I would check loading speeds though, especially on mobile.
Personally, I would change all links to relative and never worry about stuff like this. If you want to keep them absolute, then yes, I recommend changing them all. Just for a clean conscience's sake.
About find-replace: that would depend on how your website is built. I assume you're talking about WordPress? Then yes, you should be able to, as long as you know where to search.
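If it is WordPress and you have WP-CLI on the server, a quick sketch (domain.com is a placeholder; preview before running for real):

# dry run first to see what would change in the database
wp search-replace 'http://domain.com' 'https://domain.com' --dry-run
# then do it for real
wp search-replace 'http://domain.com' 'https://domain.com'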
Cheers.
Hi there.
It's recommended that you have only one H1 and H2 per page. It won't "break" your website or SEO, but multiple same-level h-tags won't help it either. So, here you go - don't use h-tags in nav. It's weird anyway.
Hi there.
I haven't dealt with such a problem before, but here is an idea - what if you "user upload" new, nice images, which will get +1s and shares? In theory this should push the unwanted images down, out of sight, or at least out of first-glance sight. This might help a little while you wait for Google to remove the bad images.
Hope this helps
Hello, my friend.
Good question you got here. Made my brain leftovers spin.
Anyway, here is my understanding (and I can be completely wrong).
As Google always says, don't treat Googlebot differently from human users; they also say don't run A/B tests on it. At the same time, Google Analytics' A/B testing works like this: when you visit a page for the first time, you get the "normal" page; then, based on the A/B test's traffic split, you either stay or get redirected after the page loads (this part is important) to the test page. After this you are assigned a cookie, so on every recurring visit you are not "played" with until the test is complete. Then all cookies are removed and everybody is served whichever version of the A/B test "won".
So, putting the three hypotheses above together, my understanding is that Googlebot is treated the same way - it gets "assigned" a version of the page (or is simply served the original) on the first visit. This makes sure there is no confusion for Google about which version of the tested page to index.
I think as long as you keep this in mind, there won't be any troubles for SEO.
Hope this makes sense and helps you.
Cheers.
Hi there.
What kind of keyword organizing are you talking about?
As far as I know, there are no good tools for organizing keywords. I use Google Sheets - you can do quite a lot with them, and you can also write your own scripts. However, we are in the process of building our own custom solution for in-house use, because it will satisfy every single nuance I need. Quite honestly, it isn't that much work if you plan carefully. So, I'd recommend talking to your developer team.
I'm not sure if I understand your question correctly - correct me if I'm wrong.
Well, when I say competitor, I mean your competitor. MOZ doesn't have any topical-relevance algorithms in calculating DA (as far as I know and understand). So, they do crawl all links (or as many as they can) on the Internet.
Now, as to why you need to look at your competitors' relative changes. Basically, as Rand said here:
My strongest suggestion if you ever have the concern/question "Why did my PA/DA drop?!" is to always compare against a set of competing sites/pages. If most of your competitors fell as well, it's more likely related to relative scaling or crawl biasing issues, not to anything you've done. Remember that DA/PA are relative metrics, not absolute! That means you can be improving links and rankings and STILL see a falling DA score, but, due to how DA is scaled, the score in aggregate may be better predictive of Google's rankings.
If the 100-DA websites got more links, then all other websites' DAs will drop proportionally. Therefore, if your website's DA dropped even more, something is wrong with your backlink profile.
It's quite weird, I know. I'm not sure if I can explain any better. Hope this helps.
P.S. I'll send this post link to Rand, maybe he can clarify more.
Hi. Exact opposite, my friend. When the highest-DA domains get a lot more links, it affects every other domain (unless your domain is 100 DA). Basically, what you need to look at is the relative change of tracked domains' DA to each other. If your domain went down 2 points and your competitor's went down 5 points - good for you. If the opposite - not so good for you.
I understand that it's quite confusing, and I don't really understand why MOZ sticks to relative DA scoring, but that's what we have to work with.
Hope this helps.
Hi there.
Also, if you Google your company name and the knowledge graph comes up, there is a direct link to write a review and a direct link to a page from which customers can write reviews.
Hi. I wouldn't use "noindex", so that the images actually get into Google's image search, etc., but canonical sounds fine.
Hi.
Is it OK? Sure. Is it helpful? That's the question. From my experience, if it's not newsworthy and people (news portals) won't pick it up on their own, it's quite pointless.
Read these Q&As:
https://moz.com/community/q/using-press-release-for-promotion-prweb
https://moz.com/community/q/pr-web-vs-pr-newswire-which-is-best-for-pr-optimization-distribution
This is what's so frustrating, ridiculous and weird - Google doesn't really know what exactly is going on in their own system, thanks to multi-layered project management, machine learning, etc.
If you are creating a filter for just one IPv6 address, why do you use a regular expression at all? Just input it as it is.
Hi there.
Well, a 3-spot drop in rankings is surely not a "quite dramatic drop". It can happen for many, many reasons. What I would recommend, though, is to check whether the domain name you have purchased ever had (or still has) a penalty or spam issues. Usually, if everything is clean, it wouldn't affect your rankings.
As for irreversibility - nothing is irreversible. However, if that domain was in fact the reason for the ranking drop due to spam issues, then it might take longer to recover.
Yes, disabling would be the first thing to try if you think that was the reason. See how your rankings react and whether there are any sudden changes. However, I do recommend checking other factors like backlink profile changes, content changes, your competition's SEO changes, and so on.
Hope this helps.
Hi, Greg.
I still think that the explanation roots in the relativity idea I quoted above. Also, I linked to the original post by Rand, so check it out - you might find all the answers.
Hi there.
It seems that there is something wrong with the JavaScript, because it looks like a piece of JS code. However, even if I remove the void part, the page still doesn't exist. Are you sure it's just a "void(0)" problem?
Hi there.
Why do you think it would be a problem? I assume you want users to see that popup, and you don't want to hide anything from Google. So, as long as everything is clear both to search engines and users, I don't think there is a problem.
Hmmm..
Well, I would make sure to work on on-page SEO and use "fetch as Google" in GWT after any work is done to ensure all changes get crawled.
Other than that, I'm not even sure what to say. Check periodically for tech issues and other errors, and make sure that all redirects are working properly.
Hi there.
Phew! After reading all that text, I'm still not sure what the exact issue is. Correct me if I'm wrong: basically, after the redesign and restructuring, the website doesn't rank, even though there are no technical issues?
Now, I understand that tech issues can be a reason for not ranking, but what about actual on-page SEO - is everything else perfect?
Hello, my friend.
Well, I have done it both ways on different websites, and I've seen both ways work fine. However, just to cut the time spent adding schema and avoid potential problems (who knows how Google is gonna treat duplicate schema tomorrow), I tend to add LocalBusiness markup to the header OR footer and use the contact page for contactPoint markup. This has been working swell for me.
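For reference, a minimal sketch of the sitewide part (JSON-LD in the header or footer; all values are placeholders):

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "LocalBusiness",
  "name": "Your Business Name",
  "telephone": "+1-555-555-5555",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Houston",
    "addressRegion": "TX",
    "postalCode": "77001"
  }
}
</script>

The contactPoint block would then live only in the contact page's markup, so nothing is duplicated sitewide.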
Hope this helps.
Hi there.
Please refer to this topic by Rand: https://moz.com/community/q/da-pa-fluctuations-how-to-interpret-apply-understand-these-ml-based-scores
Read #2:
You've earned more links, but the highest authority sites have grown their link profile even more
Since Domain and Page Authority are on a 100-point scale, the very top of that represents the most link-rich sites and pages, and with nearly every index, it's harder and harder to get these high scores and sites, on average, that aren't growing their link profiles substantively will see PA/DA drops. This is because of the scaling process - if Facebook.com (currently with a DA of 100) grows its link profile massively, that becomes the new DA 100, and it will be harder for other sites that aren't growing quality links as fast to get from 99 to 100 or even from 89 to 90. This is true across the scale of DA/PA, and makes it critical to measure a site's DA and a page's PA against the competition, not just trended against itself. You could earn loads of great links, and still see a DA drop due to these scaling types of features. Always compare against similar sites and pages to get the best sense of relative performance, since DA/PA are relative, not absolute scores.
Basically, DA/PA is a relative metric - it's relative to the 100-DA websites. So if, let's say, Facebook got a million more links and you got only 10, then relative to FB your website's DA will drop. That's why so many websites' DAs dropped after the last crawl.
To see how well or badly your DA changed, compare it to your competitors'.
Hope this helps.
It worked for me. I just checked. Are you doing it in Custom, rather than Predefined?
As far as I know - no, they don't support it yet (weird, I know). It says so here: https://support.google.com/webmasters/answer/83106?hl=en
Note: The tool does not currently support the following kinds of site moves: subdomain name changes, protocol changes (from HTTP to HTTPS), or path-only changes.
It should be
2606:a000:4e2b:d500:61d9:9d4e:c539:ea86|174.99.90.148
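One note, assuming the filter field is treated as a regular expression: an unescaped dot matches any character, so escaping the dots in the IPv4 part is safer:
2606:a000:4e2b:d500:61d9:9d4e:c539:ea86|174\.99\.90\.148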
Again, does it work if you create a filter just for the IPv6 address?
Hi there.
Well, I think you actually have two questions:
Well, as for Q1 - the only way in your situation is to do a "fetch as Google" in GWT for your website's page when you've just posted a new job. Usually it takes just a couple of minutes for Google to index it and include it in SERPs. So, first make sure your page is indexed, and only then start posting the same job on other job search websites.
As for Q2 - that's a completely different story. Even if Google understands that your website was the original author, it doesn't mean whatsoever that your website is going to outrank the large job websites. To make that happen, you have to try to get as many backlinks as possible to those job postings (maybe from other websites with "click for full job description"), and more backlinks to your domain in general.
Hope this helps.
Hi there.
Well, in the same article you are referring to is this text:
Amazon use to use a lot of tabs but now they seem to output most of the content directly on the page, making the user scroll and scroll to see the content. Google's own help documents does use click to expand but only to see the questions.
Also there was this video from Matt: https://www.youtube.com/watch?v=UpK1VGJN4XY
I understand that a lot of this content contradicts each other, etc., but I'd look at the problem like this: it's not a secret at all that Google puts (or at least states that they put) user experience first. So, look at your page and see if users, after they land on it, would be happy. See if everything makes sense from the user's point of view, and if the "expand" buttons are large enough and convey that clicking on them expands content, etc.
Also, as Matt said, are there 8 pages of content hidden and displayed only after you click "expand", ruining your day?
I believe that as long as it looks good, makes sense to the user and is good content, there shouldn't be any problems. The only workaround I see is, instead of expandable content, to simply have links to other pages. I've seen both scenarios work.
Hope this helps.