Posts made by Millermore
-
RE: Cleaning WP theme 404s in GSC
Here is the article I was referring to: https://support.google.com/webmasters/answer/1269119?hl=en
-
RE: Cleaning WP theme 404s in GSC
I should've said the "Remove URLs" tool instead of the Disavow Tool. Yes, the Disavow Tool is for disavowing incoming links you don't want. The Remove URLs tool is for removing content from Google, but I went through their page on how to use it, and it says not to use it for content that no longer exists, since Google will drop that naturally over time. Well, how long does that take? Months? And what happens if I do use it? Ugh, this is very annoying, as it's affecting a lot of my websites, and I don't know how much of an impact these Crawl Errors actually have on a site. Again, I understand the value of URLs that people are actually linking to, but this is more like hidden content that Google found, which I've since gotten rid of, but they're still looking for it. Any help is appreciated.
-
RE: Cleaning WP theme 404s in GSC
The pages exist, but they are unpublished drafts, not accessible to the public. I have marked them as fixed and they keep popping up.
I've checked the site and I'm not linking to them on any of the pages that are live. It just seems like before I marked them as drafts, Google spotted them and is still looking for them. They were never in any sitemap I've submitted before, so I'm confused by this. I've also opened up a thread in the past regarding why some 404 crawl errors come up for desktop, and why different ones come up under Smartphone.
-
Cleaning WP theme 404s in GSC
I'm trying to clean all of the Crawl Errors for my sites, and I've reached the point where I've become slightly confused. A lot of these pages that come up in Crawl Errors aren't being linked to anywhere. The ones I'm referring to are mostly pages that came with a theme that I'm using - part of the demo content - which I've since set to Unpublished Drafts. I'm not linking to these pages anywhere on any of my Published pages, yet Google is still looking for them, still showing them in Crawl Errors as Not Found.
I'm assuming that Google found these pages at some point and can't find them now. I'm not sure if I'm supposed to keep setting up 301 redirects for these, or should I use the Disavow tool for these pages? I want to tell Google to forget these pages completely because I never intended for these pages to be indexed.
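One idea I keep coming back to (and I'm honestly not sure it's the right call) is answering those old demo URLs with a 410 Gone instead of a plain 404, since a 410 is supposed to tell Google the content is gone for good. A rough sketch of what I mean, in a theme's functions.php, with the slugs being placeholders:

```php
<?php
// Rough sketch only: answer requests for the old theme demo URLs with a
// 410 Gone so crawlers treat them as permanently removed. The two slugs
// below are placeholders for whatever shows up in Crawl Errors.
add_action( 'template_redirect', function () {
    $gone_paths = array( '/demo-home/', '/sample-portfolio/' ); // placeholders

    $request = parse_url( $_SERVER['REQUEST_URI'], PHP_URL_PATH );
    if ( in_array( $request, $gone_paths, true ) ) {
        status_header( 410 );   // "Gone" instead of the default 404
        nocache_headers();
        echo 'This page has been removed.';
        exit;
    }
} );
```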
This happens for just about all of my WordPress websites in Google Search Console. Can someone please shed some light on this? If there are any articles on this problem, please share! Thanks!
-
RE: URL Errors for SmartPhone in Google Search Console/Webmaster Tools
Looks exactly the same on my phone as it does on desktop. The pages coming up as 404s in GSC under Smartphone are NOT listed on this page-sitemap.xml page.
-
RE: URL Errors for SmartPhone in Google Search Console/Webmaster Tools
I am using Yoast, but I am only using the Page sitemap, and it is the only one I have submitted for the affected sites. Again, this doesn't really explain why it's coming up under Smartphone and not Desktop. Also, Google does tell you where these are linked from, and it does say the sitemap page, but when I looked at the sitemap page, these pages are not listed. I am looking at it on my desktop, though, and not my smartphone. If I looked at the /page-sitemap.xml page on my phone, would it look any different? :::quickly picks up his phone and tests it out:::
-
RE: How long will this Site be punished? (place your bets!)
It might be worth it to switch domain names at the end of the day, depending on how important it is to you.
Fortunately, I've never had to do it, but I've read a million times that after you disavow and try to remove bad links as best you can, once you've literally attempted to get rid of each and every link that could be causing you issues, you have the option of submitting a "reconsideration request" to Google.
Read this guide, it's great, and the reconsideration request is mentioned in Step 7: https://moz.com/blog/ultimate-guide-to-google-penalty-removal
Best of luck! Let us know if anything changes in the future!
-
RE: How long will this Site be punished? (place your bets!)
I don't think anyone can give you a real answer to that question. Any answer would be speculation. There's also not enough info here.
Have you gone through Google Search Console and looked under Manual Actions? How does the rest of GSC look in terms of errors and such?
I would try using the Moz Spam Analysis tool and see if you missed any other bad links.
It sounds like it could be possible that Google is manually banning you from the first page. I don't know if that is a real thing or if they actually do that, but I suppose that could be possible! Have you tried sending a message to Google explaining all the steps you have taken to try to remove all spam?
-
RE: URL Errors for SmartPhone in Google Search Console/Webmaster Tools
I'm seeing similar issues. I was going to post a question, but found this when I searched.
I'm using WordPress and I have some theme pages that I have set as drafts, so I can access them on my end, but the public gets a 404. In Google Search Console, under Desktop, none of these pages come up. But under Smartphone, Google is somehow finding these unpublished draft URLs and reporting them as 404 Not Found errors.
My questions are: why is Google seeing these pages at all, and why does it only happen under Smartphone and not Desktop? And my last question is the obvious one: how do I fix this?!
Thanks!
-
RE: Net Neutrality: FCC Votes To Make Internet Public Utility
Here is Mashable's article on what's next: http://mashable.com/2015/02/27/net-neutrality-whats-next/?utm_cid=hp-hh-pri
-
RE: What is the impact of HTTP/2 on SEO ?
I don't think they're releasing HTTP/2 as another option alongside HTTP, each with its own pros and cons. I don't look at it as HTTP vs HTTPS, for example, but rather as HTTP/2 - the next generation of HTTP. There may be short-term, temporary issues as it rolls out, and some hosts may take a while to get on board. But aside from that, I'm hearing it will seriously speed up load times, since it can load multiple elements simultaneously over a single connection, as well as bring an increase in security (in practice it's being deployed over TLS), which is much needed these days.
-
RE: Net Neutrality: FCC Votes To Make Internet Public Utility
Here are a few articles I found on today's events. Please keep in mind each source and its respective bias.
HuffPost: http://www.huffingtonpost.com/2015/02/26/net-neutrality-fcc-vote_n_6761702.html
Reuters: http://www.reuters.com/article/2015/02/26/us-usa-internet-neutrality-idUSKBN0LU0CA20150226
BBC: http://www.bbc.com/news/technology-31638528
Here is an excerpt from the BBC article:
"The main changes for broadband providers are as follows:
- Broadband access is being reclassified as a telecommunications service, meaning it will be subject to much heavier regulation
- Broadband providers cannot block or speed up connections for a fee
- Internet providers cannot strike deals with content firms, known as paid prioritisation, for smoother delivery of traffic to consumers
- Interconnection deals, where content companies pay broadband providers to connect to their networks, will also be regulated
- Firms which feel that unjust fees have been levied can complain to the FCC. Each one will be dealt with on a case by case basis
- All of the rules will also apply to mobile providers as well as fixed line providers
- The FCC won't apply some sections of the new rules, including price controls"
-
Net Neutrality: FCC Votes To Make Internet Public Utility
It sounds like it is now official: the FCC has voted to make the Internet a Public Utility, supporting Net Neutrality. But before I jump for joy, I'm asking myself, "What exactly does this mean?"
I know at least part of what it means: ISPs won't be able to throttle data, and they won't be able to package together access to websites for additional fees the way they do with television channel packages. That's great, in my opinion. Even though I'm a Libertarian and believe strongly in freedom, I know that those practices would have a seriously negative impact on the Internet, especially for people like us who rely on it daily for our livelihoods.
The problem I'm finding is that what they voted on contains ~322 pages of new regulations for the Internet. I have no idea what is in those 322 pages, and I doubt anyone who voted on it does either. The Democrats are loving it, while the Republicans are calling it "Obamacare for the Internet." My mind works more along the lines of: there must be pros and cons. I'm just very curious what those pros and cons are, and what this will actually mean for those of us in the online marketing industry, as well as anyone who works on the Internet.
I'm not looking for any answers, and I'm especially not looking for a political or biased debate. But I think there should be a place where we can discuss this issue, because it has the potential to be extremely important to us.
Please share your thoughts, findings, and research here where we can discuss them. I'm looking forward to learning from you all, and I hope I can add some useful insights to this conversation.
Once more: please, do not turn this into a political debate - this is not the place for it. Please keep it to how Net Neutrality and Internet as a Public Utility will affect the Internet and Online Marketing landscape for us, our clients, and our customers.
-
RE: Optimal SSL Solution?
I probably had at least 5 more phone calls with people about SSL yesterday after I posted here. It turns out WP Engine overestimated the added load time; they now think it'll add less than half a second, which I can deal with. I also learned that the certificate itself won't really affect speed.
I decided to let my clients choose which route they want to go: a $50 standard certificate from GeoTrust, a $300 EV from GeoTrust, or a $1000 EV from Symantec. I explained to them that the main difference between the $300 and $1000 options is really just the brand name and how much visitors trust it.
My next dilemma is whether or not to go 100% HTTPS. I am leaning towards it - I just don't know if it's overkill or not. I'm assuming 100% is the long-term ideal route. If, for example, I have an e-mail opt-in in the footer or sidebar of all my pages, I guess it'd be best to secure all of the pages then. I'm assuming the only negative is a possible reduction in load time?
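If I do commit to going 100% HTTPS, my understanding is that the redirect side of it is the easy part. Just as a sketch of the idea at the WordPress level (in practice I'd let WP Engine handle this in the server config), assuming the certificate is already installed:

```php
<?php
// Sketch only: push every request onto HTTPS with a 301. Assumes the SSL
// certificate is already in place; a server-level rewrite is normally the
// better home for this, but it shows the idea.
add_action( 'template_redirect', function () {
    if ( ! is_ssl() ) {
        wp_redirect( 'https://' . $_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI'], 301 );
        exit;
    }
} );
```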
Thanks a lot for the awesome responses. I really feel like I'm getting a much clearer picture of this whole situation.
-
RE: Optimal SSL Solution?
Thank you very much for that response, and for the links. I just finished reading through that Moz article in its entirety. Between that and your response, I agree that the EV sounds like the ideal certificate to go with. Now here are my new questions:
Now I want to get an EV Certificate and I want to go 100% HTTPS site-wide, as that seems like the ideal combination, especially long-term for SEO. I just finished a long phone call with WP Engine regarding this situation. While they resell RapidSSL Certificates, they do not offer EVs. I can still buy one from someone else and bring it over, though. When I told them I want to go 100% HTTPS, they said they can help me, but that it will add to the load time because the site needs to make the handshake every time a page loads. Right now my site loads in .1 seconds (WP Engine rocks!) and I don't really care if it doubles with 100% HTTPS. However, they told me it will add 1-2 seconds. That sounds like a lot to me. We went back and forth on this a lot, but they stuck to it: generally it will add 1-2 seconds.
I noticed that Symantec sells $1500/yr SSL Certificates. The higher up you go in the tiers, the faster the speed - at least, that's what they claim. In this case, does it mean that if I want my site to still load in .1 seconds with 100% HTTPS, I have to pay $1500/yr? If so, that leads me to this question: how can Google want your site to load in under .25 seconds and also want your site to be 100% HTTPS? I mean, if I have to pay $1500, it is what it is, and that will definitely separate the big boys from everyone else - and my clients might be fine with that. But this is leading me to believe that something in the overall equation is not right - I must be missing something or have something wrong here.
-
Optimal SSL Solution?
I am in the process of moving all of my client's websites to HTTPS. I have a client with an SSL certificate through GoDaddy for an e-commerce site, and my host WP Engine offers them for $50/year each. This has been fine, but now I am trying to move about a dozen sites over and I'm just trying to figure out the best, most ideal way possible to do this. I could just go through WP Engine and pay them for the certificates, but after doing research on different SSL providers, I've totally confused myself. I have seen a wide range of prices for certificates, but I can't tell if it's just BS or there is actual value. I'm talking about a $10 certificate vs a $250 certificate through Symantec.
Aside from that, I have found a few different types of certificates: single-domain certificates, wildcard certificates (which cover subdomains), and multi-domain certificates. I would love to buy one multi-domain certificate that covers all of my websites, but I'm not sure what the pros and cons of doing this are, specifically in regard to SEO. Can anyone explain the pros and cons of these for my specific situation?
I'd love to hear any recommendations for my situation, and if there is something else I am missing that is important, please share!
-
RE: Not able to access Moz pro page and my campaigns
I thought it was just me, but looks like everyone is being affected by the redirect loop.
-
Domain Extensions
I wanted to get everyone's thoughts on the new domain extensions that are now available. I'm considering buying a couple of .lawyer and .attorney domains for clients. I noticed that when I tried to buy them, I was asked to verify that we'd be offering legal services through the site. If that verification is actually required, it means not just anybody can have these domain names. That leads me to think these domains might benefit when users search for terms containing "lawyer" or "attorney." I haven't seen anything yet on how these extensions relate to SEO, but I'd like to know your thoughts on how they'll be treated in the future. I can imagine they'll be more valuable than the old .net, .us, .info, etc., domains.
-
RE: Scheduled Custom Reports Not Running
I reported similar issues with many of my reports. They told me they are dealing with DDoS attacks, which are affecting the reports.
-
RE: Analytics not tracking traffic from Old Domain Redirect
It should still record the data in there as long as you are tracking the new domain. The traffic from the old domain will come up as referral traffic. These may be considered self-referrals, which you may not be tracking. Here is an article with more information: https://support.google.com/analytics/answer/3198398?hl=en-GB (I gotta give credit to Gary Lee for that link from this post http://moz.com/community/q/will-301-redirects-same-domain-show-as-referral-traffic-in-analytics)
Were you tracking the traffic from the old domain? Do you know how much traffic that site was getting on a daily basis? Or are you just assuming it was getting traffic and therefore you should expect an increase in traffic?
-
RE: How much is the effect of redirecting an old URL to another URL under a new domain?
It's hard to say "how much" but it will be important as it will result in duplicate content if you don't redirect the page. If you're asking how much link juice it'd be worth, it'd depend on the authority of the page.
Here are some articles to read:
- http://moz.com/blog/expectations-and-best-practices-for-moving-to-or-launching-a-new-domain
- http://moz.com/blog/seo-guide-how-to-properly-move-domains
- https://yoast.com/move-wordpress-blog-domain-10-steps/
- http://www.wpbeginner.com/wp-tutorials/how-to-properly-move-wordpress-to-a-new-domain-without-losing-seo/
-
RE: How/why is this page allowed to get away with this?
I guess you're right, but does that mean that Google wouldn't consider this a black-hat technique just because the link juice is divided by so many links? I thought it would actually be the opposite, that having only 5 or 10 links passing juice on a page would be okay, but something like 600 would be considered spam. I don't know, but perhaps Matt Cutts has said something about this specifically.
Regardless, have you, or has anyone here, heard the phrase, "If your intention is to gain rankings in Google, then it's black-hat"? Based on what I've gathered from Matt Cutts, basically anything you do along those lines - such as listing a bunch of links like this without nofollow attributes and asking to trade links - is considered black-hat. If I'm wrong, please let me know.
But let's assume that everything you're saying is correct. How can we make the most of this situation? For example, I actually went into Open Site Explorer, filtered for followed external links, and sorted them by Page Authority. According to Moz, this was the most powerful link pointing to the site (I believe I was researching Quirky.com). If what you're saying is true, shouldn't Moz's metrics be updated to take into account the number of links on the linking page, perhaps by dividing the Page Authority by the link count to give us a new number? That would probably be a much more accurate way of ranking pages by how powerful they are, or how much link juice they actually pass. Maybe there's a way to do that now and I'm just not aware of it. Do you have any strategies for this sort of thing - dividing link juice between the links on a page?
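To put a rough number on what I mean, here's the kind of back-of-the-envelope math I'm picturing - purely my own heuristic, not anything Moz or Google actually calculates - where a linking page's value is split across every followed link on it, the way original PageRank divided a page's score among its outlinks:

```php
<?php
// Crude heuristic, not a real Moz or Google formula: split a linking page's
// authority evenly across its followed links, echoing how original PageRank
// divided a page's score among its outlinks.
function estimated_value_per_link( $page_authority, $followed_link_count ) {
    return $page_authority / max( 1, $followed_link_count );
}

// Made-up numbers: a high-authority page with 618 followed links vs. a
// modest page with 20.
printf( "Directory-style page: %.3f per link\n", estimated_value_per_link( 70, 618 ) ); // ~0.113
printf( "Modest page:          %.3f per link\n", estimated_value_per_link( 40, 20 ) );  // 2.000
```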
-
RE: Error reports showing pages that don't exist on website
I would avoid using the "Remove URL" option in GWT. The 301s are more ideal, in my opinion, because let's say I still have that old URL posted on my website somewhere, and right now it leads to a 404 page. If you redirect it, people following my link will be taken to a live page instead, and you don't have to worry about getting me to update the old URL on my site. The link keeps working, it takes visitors to an active page, and it can still get you some traffic. The "Remove URL" option won't give you any of that.
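For what it's worth, a 301 doesn't even need a plugin. A minimal sketch from a theme's functions.php, where both paths are just placeholders:

```php
<?php
// Minimal sketch: 301-redirect one removed URL to the page that replaced it.
// Both paths are placeholders.
add_action( 'template_redirect', function () {
    if ( parse_url( $_SERVER['REQUEST_URI'], PHP_URL_PATH ) === '/old-page/' ) {
        wp_redirect( home_url( '/new-page/' ), 301 );
        exit;
    }
} );
```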
Here's a helpful link straight from the source on when NOT to use the Remove URL option: https://support.google.com/webmasters/answer/1269119?hl=en
-
RE: How/why is this page allowed to get away with this?
Cyto,
Thank you again for another great response. You haven't put me off, quite the contrary. I really enjoy discussions like this because I actually work alone, as a one-man-show, and I don't get the opportunity to discuss SEO or online marketing with anyone really, let alone any experts. So personally, I rely a lot on Matt Cutts, and the info I get here at Moz, and other similar sites that I subscribe to on my RSS feed. Of course I also have a Pro account here at Moz and use it a lot for all of my clients.
I personally feel like Matt Cutts is the only person who knows what they're talking about, and the only person to trust. However, I have heard an SEO say before, "The things Matt Cutts may say is nice and all, but I rely more on the results that I actually find rather than just do what he says blindly." That makes sense, but I feel like that person was referring to doing black-hat stuff, until he gets caught. Regardless, my trust is still with Matt Cutts.
You said in your post (and it may have been a typo, I don't know) this: "My gut feeling is that, Google won't penalise a website who is an internet company made in NYC and listed on a non-profit organization website with a nofollow link. It seems like a natural fit." If, in fact, all of that were true and they were using nofollow links, I believe this entire discussion would be rather pointless, because just by using nofollow links on that page instead of 100% followed links, they would be in the clear as far as I'm concerned. I don't think there are any issues with trading links, anchor text, reciprocal links, etc., as long as they are nofollowed. But in this case they are not nofollowed. They are all followed links. And they are asking for anchor-text-optimized followed links. This is the key for me.
Now you may say to me: hey, it's an internal page with a directory, and it's a non-profit .org site. Users may actually gain from this. However, Matt Cutts has said that any time you are doing things for the purpose of gaining rankings, it is considered black-hat. They can keep that directory, keep all the links, and provide this unique benefit to their users. However, in 2013, 2014, and beyond, Matt Cutts has said this sort of page should use nofollow links, because it won't change anything at all for the user experience, but it WILL cut down on spam, since people are attracted to the page mainly because of the linking opportunity. If the links were nofollowed, I doubt there would be nearly as many people excited about getting on that page. To me, this page is primarily for SEO purposes, in that the page gains backlinks from the people who want to be listed, and the user experience is actually secondary. What I've gathered from Google is that the user experience should come first, and the way to do that here would be to nofollow all the links.
I am also aware that the algorithm doesn't necessarily take individual pages into account, but rather groups of pages with similar issues. For example, a page with masses of optimized-anchor-text links from PR1 or PR2 sites will be penalized, as we've seen from past updates. Other things, like the text/HTML ratio, should be at least 15% from what I've seen, and the number of links per page typically shouldn't be higher than 200. This page goes against all of that. The craziest part is that I would expect this page to be somewhere around PR3. But it's PR7. WHY. That is the question. Are they being penalized, and just overpowering the penalty to get there? Has Google in fact placed this website or page on some sort of "white-list" that isn't touched by the typical algorithm updates?
I'm actually to the point where I think I'm going to send Matt Cutts an e-mail and see if he responds. In the meantime, I would love to keep this discussion going! Cyto, I would love to hear another response, and if anyone else has anything to add, or any other thoughts or theories (OR EXPERIENCE WITH THIS EXACT SORT OF THING), kindly add to this discussion! Thanks!
-
RE: Error reports showing pages that don't exist on website
Rel=canonical is used more when you have duplicate content. If you have the same post or page in two places, you can use the rel=canonical tag to tell Google which version is the original. It sounds like you don't need rel=canonical in this situation.
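For reference, the tag itself is just one line in the page's head, and most SEO plugins (Yoast included) print it for you. If you ever needed it by hand, a sketch like this would print a self-referencing canonical on single posts and pages:

```php
<?php
// Sketch only: print a self-referencing canonical tag on single posts and
// pages. Yoast and similar plugins normally output this for you.
add_action( 'wp_head', function () {
    if ( is_singular() ) {
        echo '<link rel="canonical" href="' . esc_url( get_permalink() ) . '" />' . "\n";
    }
} );
```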
It sounds like you have 80-something 404 Page Not Found errors. I would use the "Redirection" plugin for WordPress. Take each URL in your report that's returning a 404 and redirect it to the most relevant live page for what used to be there. If there really is no relevant page at all, I would just redirect it to the homepage. In my opinion, it's better to redirect to the homepage than to have the user land on a 404 page. I would do that for every 404 error you're getting. Doing this, I don't think you'll need rel=canonical at all.
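If you'd rather not click through the plugin for all 80-something URLs, the same idea can be sketched in functions.php with a simple map. Every path here is a placeholder; the ones with no relevant match just point at the homepage:

```php
<?php
// Sketch: 301 each known 404 URL to its most relevant live page; anything
// with no good match goes to the homepage. All paths are placeholders.
add_action( 'template_redirect', function () {
    $map = array(
        '/old-service-page/' => '/services/',
        '/2013-promo/'       => '/',          // nothing relevant -> homepage
    );

    $request = parse_url( $_SERVER['REQUEST_URI'], PHP_URL_PATH );
    if ( isset( $map[ $request ] ) ) {
        wp_redirect( home_url( $map[ $request ] ), 301 );
        exit;
    }
} );
```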
-
RE: How/why is this page allowed to get away with this?
Cyto,
I like your thinking on this one. This is where I was trying to go with it. But still, you asked many of the same questions that I asked. I realize we won't have a solid answer unless Matt Cutts himself speaks on this specific issue. However, I'm still left with unanswered questions. Here's a few points that are left standing:
- I realize there are billions, if not trillions, of websites and pages in existence. However, there are not billions of pages sitting at a PageRank of 7. You can try to disregard their PageRank and tell me it's going to be deprecated soon, or it's not accurate, or whatever. But regardless, they got that page to a PR7. If you think that doesn't matter, I'd like to see you try to get your page to PR7 and tell me how long it took. What I'm saying is that I don't think they magically got to PR7 overnight, and I don't think Google has missed this site. There are only so many PR9s, PR8s, and PR7s out there. What are the chances that Google completely missed AND messed up on the PageRank for this site? The only other explanation I have for the PageRank is that they were white-hat for a long time, and once they got to PR7, they flipped to this black-hat type of page. But I doubt that's the case. They're either still benefitting from black-hat techniques, OR we are misjudging this site and Google actually does think it deserves a PR7.
- Try thinking about it like this: yes, this page is practicing many things that are straight-up black-hat, things that Matt Cutts has publicly and openly said are considered spam - simple things like the text/HTML ratio, the number of links per page, asking to trade links, or having masses of links without nofollowing any of them. What if Google saw this page, said "wow, this is a black-hat page," and penalized it? Let's assume the page is penalized. But what if all the sites listed there are linking back to this page, and the link juice from all those pages pointing back at it is so much more powerful than the penalty that it overpowers the penalization, bringing them to a net PR7? The question here is: can you overpower Google's penalization with more bad backlinks?
- Looking deeper into the whole ".org/non-profit/maybe Google likes these types of pages" angle: perhaps they do, and we're all just wrongly assuming things. In this case, I agree with Cyto - this page could be unique, and it does benefit the user. However, isn't this exactly the scenario where Matt Cutts has told us to implement a nofollow tag? I believe he has said repeatedly: if you must link to another site and you're not sure about it, just put a nofollow tag on it. If you have reciprocal links, there's no need to get rid of them - just nofollow the links. It's this sort of thing that's giving me trouble fully accepting that this is a good page and Google likes it. And IF Google does like this page, and the PR7 is deserved, and the followed links are fine, then I SHOULD try to get my client a link on this page. But I suppose there is a risk, because we won't know 100% for sure unless Matt Cutts says so.
- Diving deeper into the "Google may like this sort of page" idea for the reasons you stated: it sort of contradicts what Matt Cutts has already said. For example, if I put a link in a press release back to my homepage, there is some value in that link to the user, because it makes it easier for them to visit that page by clicking instead of typing in the URL. Even so, press release links have been nofollowed across the internet. You could use the same "it creates value" argument for any link, but Google is telling us to nofollow these. Especially when it comes to directories specifically, I have read that Google is shutting these sites down completely. So we're left wondering whether this specific site is on some sort of "white-list." In that case, the first person to create a "directory of white-listed directories of followed links" will, I'm sure, be quite successful with that page.
- What is stopping me from creating a .org page similar to this? Why can't I build a page up to PR7 and openly exchange links with people? The biggest thing stopping me from even thinking about it is that I assume this only worked 5+ years ago. Regardless, I have a client who sells a few unique products, and one of their competitors is Quirky.com, which is what led me to find this page - they have a backlink from it. The problem I'm seeing is that Quirky.com is benefitting from a link on this page, while I'm worried about joining it due to a potential penalty. Quirky doesn't really have to worry about anything because they have so many links and they're established. But if I wanted to get the same link, I'd have to worry. This is the sort of thing that makes it hard to compete with the big players. Not that I think this client is on par with them, but I get the feeling they're allowed to do more than we are. Perhaps I'm wrong, but that's the feeling I have.
- It's getting harder and harder for me to find white-hat followed link opportunities. It seems like everywhere I go, the link is going to be nofollowed. Other people's websites, they want to noFollow the link. Guest blog posts, they want to noFollow the link. Press releases are all nofollowed now. The case is either the link is noFollowed, or you risk penalization on a followed link. This is the corner I feel I'm getting pushed into.
- I learned a while back from an SEO that links are the most powerful form of currency in the SEO world. A link is the number one most powerful way to get higher up in the rankings, for the reason that it is basically a sign of saying "this site is trustworthy and worthwhile to check out" and Google puts those things together to say they are worthy of higher rankings. And it all makes sense to me, and I haven't seen anything to tell me otherwise. If I'm wrong and I missed something, let me know. I mean, it's great to put out unique content and all that, but what is the point of the guest post or the press release if there's no indication that you wrote it or that it has anything to do with your site? What is it worth at that point if there is no link included? I understand the organic side that some people may literally read it and visit your site off that, but that's an inefficient way of doing things. I'm down with "link-earning" but only if I can actually earn a followed link. What's the point of a link-earning process if you don't earn the link, know what I mean? It just seems like everything is going this way of noFollowing links, or you have to worry about a penalization. And before you say it, I am aware that it's less than 20% of all links that are noFollowed, but still, this is the feeling I'm getting. (That number may be higher now that all Press Release links are no followed, not sure)
- I'm really not trying to do anything black-hat. I'm trying to do white-hat stuff here, but with the purpose of accelerating my client's process of getting higher in rankings. Listen, I'm doing all the other stuff well, it's just this whole link-building/earning aspect is tough and it seems like 2014 is going to be much harder than previous years.
What are your thoughts on these points?
-
RE: Error reports showing pages that don't exist on website
For me personally, on WordPress I use the Yoast SEO plugin, and I went through the tutorial on the Yoast website. It shows you how to eliminate a lot of the duplicate content that automatically gets created on all WordPress websites. Once you've noindexed and gotten rid of all the unnecessary archives and such, I'd recommend going back to the error report to see the difference and whether those pages keep coming up. If they do, simply 301 redirect them to another page on your website. Then check again after the redirects and see what you're left with. Sometimes it takes a couple of weeks for the changes to show up, from what I've seen. I'm not sure if this is the exact issue you're having, or if you're even using WordPress at all, but if you are, this might help you - it's how I got my errors down to zero.
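If you're not on Yoast, the same noindexing of the junk archives can be sketched by hand; something along these lines (Yoast's settings do the equivalent without any code):

```php
<?php
// Sketch: mark date, tag and author archives as noindex so the thin,
// auto-generated archive pages drop out of the index. Yoast's settings
// accomplish the same thing without code.
add_action( 'wp_head', function () {
    if ( is_date() || is_tag() || is_author() ) {
        echo '<meta name="robots" content="noindex,follow" />' . "\n";
    }
} );
```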
-
RE: How/why is this page allowed to get away with this?
So what I've gathered so far is that you're saying that indeed this page is considered "black-hat" for the reasons mentioned above, and that eventually the site, as well as possibly the sites that are listed on this page, could all receive a penalty, and that I should stick to white-hat strategies.
But let's take this a step further. We are simply assuming it is a "black-hat" page based on things we've heard and accepted as fact. However, what if Google and Matt Cutts actually see this page differently, perhaps even as "white-hat"? You might then ask: how could they see it as white-hat if it's breaking all these rules? At that point I would look at something like the Yahoo directory, the one where you pay $300 per year to get one backlink. I feel like these two sites are fairly similar and break similar rules.
So for some reason, Google likes the Yahoo directory and lets them do what they want. Perhaps they are also putting this webpage in the same boat as the Yahoo directory. Matt Cutts' excuse may be something like, "Well, this page has been around for a while, and it's actually quite a unique page in that there are no other directories like this on the internet, and therefore is actually providing a benefit to the user in the form of a directory of startups in New York." But then I would ask, why not nofollow all the links, because isn't that the whole point of the nofollow tag?
He also may say something like: they have strict guidelines and rules for letting people into the directory. After all, it's not open to anyone - only startups in NYC that get >10k visitors per month to their site. But still, why not nofollow the links? They are also blatantly asking to trade links. And after all that, they still have a PR7?
The next question I ask is: should I try to get my client on this list too, so that he can benefit from the directory and the PR7 backlink? I'm thinking twice about it because I don't want to wake up one day penalized because I have a link in that directory and a reciprocal link going back to that site. If you ask me, this page is a perfect example of what Google doesn't want, and yet Google is rewarding them. So I'm not sure if perhaps I'm the one who is wrong, and Google actually likes this site because it may be unique to some extent.
I would love to hear some sort of official response on this from someone at Moz, as well as any other people here who are familiar with this sort of situation, and any success on pages like this that we are assuming are black-hat. I would love to have someone from Moz actually visit that page and give me their analysis on if the page is breaking any rules, and why it has the PageRank it does.
-
How/why is this page allowed to get away with this?
I was doing some research on a competitor's backlinks in Open Site Explorer and I noticed that their most powerful link was coming from this page: http://nytm.org/made-in-nyc. I visited that page and found that this page, carrying a PageRank of 7, is just a long list of followed links. That's literally all that's on the entire page - 618 links. Zero nofollow tags. PR7. On top of that, there's a link at the top right corner that says "Want to Join?" which shows requirements to get your link on that page. One of these is to create a reciprocal link from your site back to theirs.
I'm one of those white-hat SEOs who actually listens to Matt Cutts and the more recent stuff from Moz. This entire page basically goes against everything I've been reading over the past couple of years about how reciprocal links are bad, and how, if you're going to do it, you should use a nofollow tag. I've read that pages, or directories, such as this are being penalized by Google, and possibly the websites with links on the page could be penalized as well. I've read that websites exactly like this have been getting deindexed in droves over the past couple of years.
My real question is how is this page allowed to get away with this? And how are they rewarded with such high PageRank? There's zero content aside from 618 links, all followed. Is this just a case of "Google just hasn't gotten around to finding and penalizing this site yet" or am I just naive enough to actually listen and believe anything that comes out of Matt Cutts videos?
-
RE: Rankings Bouncing Weekly
Thanks Cyrus. I hear what you're saying, I'm just confused as to why Google would allow us on the first page for so many keywords every other week if we had so many bad links? My understanding was more that you have rankings, and you hold those rankings unless a competitor comes and overtakes you, or if you get penalized by Google. I guess this is technically the latter, but if I was getting penalized I wasn't expecting to be anywhere close to the first page for my targeted keywords. I suppose I will look into cleaning up the link profile. I am also planning on switching hosts to WP Engine from our HostGator VPS as it's just too slow. Perhaps changing hosts and speeding up the load time will help a bit as well.
I'll continue to tell the client to write more content. I personally don't do any black-hat stuff, especially link-building, so I'll have to go in and see what his old SEO people did.
-
RE: Rankings Bouncing Weekly
The website is nytrafficticket.com and as I said we'll have a solid amount of keywords in the top ten, then the next week they'll drop out of the top 40, sometimes past the top 50, then the next week they'll come back to the top ten. It seems as if the Moz rankings just aren't picking up those keywords for the weeks where they drop off then come back. But if it isn't that, then I can't figure it out. If this is normal, then I suppose the next question is how do you keep your rankings in the top ten consistently, every week.
-
RE: Help with homepage SEO please
Here's a video from Matt Cutts himself discussing how many links you should have on your page, and if there is a limit.
-
RE: Help with homepage SEO please
I took a look at the site, and you're right, there are a lot of links in that mega-menu. However, it looks like you can reduce the number of links in the mega-menu simply by rethinking how you structure the links and pages. For example, if you go to the Franchise tab and look under Get In Touch, there are 6 links right there you can get rid of, as all 6 go to the exact same page. You don't need 6 links like that going to one page, especially when some of them only have one line of text. I recommend going through all of the links, seeing which others are like this, and reformatting them. It looks as if a designer decided how to structure it without thinking about SEO and wanted to put in as many links as possible to make the mega-menu look more complete. I would explain to the client that the designer made a mistake and that you need to rethink the links.
-
RE: How can get inffo on all in one SEO plugin
I recommend checking out the Yoast WordPress SEO plugin instead. You can follow this guide to set it up: http://yoast.com/articles/wordpress-seo/ - it will really help your site out a lot.
-
Rankings Bouncing Weekly
I have a client who ranks well for a number of keywords. This week we have about 20 of the keywords we're tracking in the top ten on Google. But every week, it seems, the keywords bounce around quite a bit. This week, for example, at least 15 keywords out of the ~90 we're tracking jumped between 30 and 40 spots. The biggest move for any keyword was 43 spots, with a few 42s and 41s, in one week. Next week they'll drop a bit, then bounce up again just like this. This has been happening for a while, and I'm trying to figure out what the issue is.
-
RE: Hi guys, how can I access the on-demand Top Pages Reports? Thanks!
I'm not sure exactly what you're talking about, but perhaps it's in your Reports section. I don't know what you mean by "on-demand" or "Top Pages Reports," though. Do you mean the top pages on your site for certain keywords? Or the top pages that are ranking for specific keyword terms? And do you mean "on-demand" as in you don't need to wait a week to generate the report?
-
RE: Ranking keyword ecommerce product
I recommend you do some keyword research using the Google Keyword Tool or Planner and see what people are searching for. See how many people are searching for it with the space vs without the space. That should give you a good idea. You also need to see if you'll be able to rank without the space by checking the competition level, but also the keyword difficulty using Moz tools.
There may also be other alternative formats that people are searching for that you can take advantage of. For example, maybe there's a bunch of people that aren't using the last 3 numbers? Or maybe there are a lot of people who just type in the model number without "New Balance" in front of it. Doing good keyword research will help you decide what keywords to target.
You can also use Moz's Beginner's Guide to SEO, which covers keyword research and what to do afterwards. There is also a great post on the 3-tier keyword system, along with a video, at Moz.com/academy. I highly suggest you check those out.
-
RE: Looking for a Way to Standardize Content for Thousands of Pages w/o Getting Duplicate Content Penalties
I totally understand what you're trying to do. What I'm trying to say is that there may be another way to get this location-specific information to your users. Perhaps if you had one "sharepoint training" page, you could include all the locations there, with a schedule that changes when you hover over or click a location but keeps you on the same page (see the sketch below). This would likely be much safer with Google and would reduce the amount of work significantly. However, you may lose potential SEO value without individual pages for each location. Again, it's a balance: if you are able to create the pages without them being seen as duplicate content, then you're safe. If you can't make them unique, try to think about another method.
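To make the one-page idea concrete, here's a rough illustration with made-up data: render every location's schedule on the single "sharepoint training" URL (you could then show or hide each block client-side), rather than spinning up a near-duplicate page per city:

```php
<?php
// Illustration only, with made-up data: output every location's schedule on
// one page instead of creating a near-duplicate page per city.
$schedules = array(
    'New York' => array( 'Mar 3 - Intro to SharePoint', 'Mar 10 - Advanced SharePoint' ),
    'Chicago'  => array( 'Mar 5 - Intro to SharePoint', 'Mar 12 - Advanced SharePoint' ),
);

foreach ( $schedules as $city => $sessions ) {
    echo '<h3>' . htmlspecialchars( $city ) . '</h3>' . "\n" . '<ul>' . "\n";
    foreach ( $sessions as $session ) {
        echo '  <li>' . htmlspecialchars( $session ) . '</li>' . "\n";
    }
    echo '</ul>' . "\n";
}
```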
-
RE: Looking for a Way to Standardize Content for Thousands of Pages w/o Getting Duplicate Content Penalties
I would agree with the other two commenters here: you don't need to worry about duplicate meta descriptions, but each page needs to be unique to a certain extent. I'll try to add something different to this discussion: if we're listening to Google and Matt Cutts, and we're interested in white-hat-only techniques, then I don't think he would suggest you create so many different pages if they aren't going to be very different. If you have many pages that aren't very different, then what value is that giving to the user? Or are you actually attempting to game Google (black-hat) by creating all these pages strictly for SEO purposes? If so, perhaps you should re-evaluate your strategy.
However, if each and every location and topic is different and contains unique content such as completely different schedules and topic content, then I don't think you should have much to worry about. Just make sure that the actual content of each page is unique. Once you start creating dozens of duplicate pages, it may make more sense to try and figure out a simpler way to build out your site. You can try to balance and compare the risk of duplicate content to the benefit of having so many pages. Just focus on different content for each location and topic and you should be fine. In fact, Moz will tell you if you have duplicate content in your Crawl Diagnostics.
-
RE: The mystery of a SERP - Your Opinion is appreciated :)
I can see a few reasons: their Domain Authority and Page Authority are both higher than yours. It looks like both pages are properly targeting the keyword on-page, but perhaps you can remove the /page/ from the URL and see what that does? Make sure you redirect the old page to the new URL. Once you do that, try to get a few more quality inbound links to raise your DA and PA and you should be ranking higher than his page soon enough.
-
RE: Blog Swapping
I think the easiest answer to EGOL is that you would be missing out on a backlink. The whole question here, and thank you to Philip Crothers for asking it, is whether this act would be positive or negative from Google's perspective. If Google says this is a reciprocal link and all reciprocal links are bad, then I understand. However, if it is truly unique and relevant, would Google be okay with it, and could you actually benefit from the backlink? If Google is okay with it and you would benefit from the backlink, then I think the easy answer is to include the link to your site.
Let's look at the example EGOL provided: for attribution you can say, "Philip Crothers is an expert on wedding day hairstyles and is the owner of WeddingDayHair.com." I'm assuming you mean to use the URL as plain text, not a link. I understand what you're saying, and I think that's a great idea. However, I think there is some value to the user, in the form of convenience, if they do want to visit WeddingDayHair.com: it's easier to click a link than to copy and paste it into a browser. So I do think there is a difference in value. But which solution is most ideal will depend on whether Google sees this act as a positive or a negative one. I would love to hear an accurate answer on this specific situation.
-
RE: A few reciprocal links OK?
I can see both sides of this argument; however, I'd love to hear a solid answer. I have heard plenty of white-hat SEOs recommend asking family and friends to link to your site when you're working on building backlinks. As the online marketing and SEO guy for a company, from my perspective, I want backlinks to my site. I'm not going to do anything black-hat to get them; I want them to be as natural as possible. But I'm reverse engineering this: looking at potential links you could acquire from people you know, then figuring out how to make them as natural as possible. For example, say I wrote a brand new, original, and highly relevant guest blog post for a friend that includes a link passing juice to my site, and he sends me a guest blog post with a link to his site, and we both post them. I'm assuming this counts as a reciprocal link. However, it passes Google's "is it relevant, valuable, natural" test. This act of swapping exclusive, original, high-quality, and highly relevant guest blog posts - is it considered white-hat or black-hat? Am I going to be penalized? Am I going to gain the benefit of this backlink pointing to my site? I would love to hear an accurate answer. Assume we are not using nofollow tags. Also, let me know what the difference would be if you do this once, 50 times, or 1,000 times. Thanks!
-
RE: Number of searches for specific keywords
Yes, but is there any way to use SEOMoz Pro for this? Specifically, I'd like to see my keywords listed with their rankings (which I can do now from the Rankings tab), and in the same list, in another column, the number of local searches. When I tried out HubSpot, I could see all three (with the ability to add more columns). Is there a way I can see my keywords, their rankings, and the number of searches in one place in SEOMoz Pro?
As you can imagine, it is difficult to evaluate your rankings accurately, as you may be ranking first for a whole bunch of keywords that aren't actually being searched. I'd like to easily see which "good" keywords I'm ranking for as opposed to the ones that aren't being searched. Currently, I have to have my rankings open in one window and the GKT in another, and I have to type in my keywords to see their search volumes. This shouldn't be necessary.
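In the meantime, the workaround I've been considering is just joining the two exports myself. A quick sketch, assuming a rankings CSV (keyword, rank) exported from the Rankings tab and a Keyword Planner CSV (keyword, local searches); the file names and column order are my assumptions:

```php
<?php
// Sketch: join a rankings export with a Keyword Planner export on the
// keyword column, so rank and local search volume sit side by side.
// File names and column order are assumptions about the exports.
function load_two_column_csv( $path ) {
    $rows = array();
    $fh   = fopen( $path, 'r' );
    fgetcsv( $fh ); // skip the header row
    while ( ( $row = fgetcsv( $fh ) ) !== false ) {
        $rows[ strtolower( trim( $row[0] ) ) ] = $row[1];
    }
    fclose( $fh );
    return $rows;
}

$ranks   = load_two_column_csv( 'moz-rankings.csv' );    // keyword => rank
$volumes = load_two_column_csv( 'keyword-planner.csv' ); // keyword => local searches

foreach ( $ranks as $keyword => $rank ) {
    $volume = isset( $volumes[ $keyword ] ) ? $volumes[ $keyword ] : 'n/a';
    printf( "%-35s rank %-4s searches %s\n", $keyword, $rank, $volume );
}
```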
-
RE: What would you recommend i do to improve my rankings?
First, I would do keyword research to look for some long-tail keywords. Then, I would do an analysis on who is ranking for those keywords. Then, I would start developing new internal pages on the site as well as new content targeting those keywords. Start sending out guest blog posts, press releases, and link bait as ways to gain new inbound links. Build these more specific keywords into your site. You should be on your way to higher rankings as long as your links are coming from quality websites. You can also work to improve your on-page SEO even further.