Millermore
@Millermore
Job Title: President, Online Marketing Consultant
Company: Millermore
Favorite Thing about SEO
Watching my rankings rise
Latest posts made by Millermore
-
RE: Cleaning WP theme 404s in GSC
Here is the article I was referring to: https://support.google.com/webmasters/answer/1269119?hl=en
-
RE: Cleaning WP theme 404s in GSC
I should've said the "Remove URLs" tool instead of the Disavow tool. Yes, the Disavow tool is for disavowing incoming links that you don't want. The Remove URLs tool is for removing content from Google, but I went through their page about how to use it, and it says not to use it to get rid of content that no longer exists, because Google will drop those pages naturally. Well, how long does that take? Months? And what happens if I do use it anyway? Ugh, this is very annoying, as it's affecting a lot of my websites, and I don't know how much of an impact these Crawl Errors actually have on my sites. Again, I understand the value of fixing pages that people are actually linking to, but this is more like hidden content that Google found, which I've since gotten rid of, but they're still looking for it. Any help is appreciated.
-
RE: Cleaning WP theme 404s in GSC
The pages exist, but they are unpublished drafts, not accessible to the public. I have marked them as fixed and they keep popping up.
I've checked the site and I'm not linking to them on any of the pages that are live. It just seems like before I marked them as drafts, Google spotted them and is still looking for them. They were never in any sitemap I've submitted before, so I'm confused by this. I've also opened up a thread in the past regarding why some 404 crawl errors come up for desktop, and why different ones come up under Smartphone.
-
Cleaning WP theme 404s in GSC
I'm trying to clean all of the Crawl Errors for my sites, and I've reached the point where I've become slightly confused. A lot of these pages that come up in Crawl Errors aren't being linked to anywhere. The ones I'm referring to are mostly pages that came with a theme that I'm using - part of the demo content - which I've since set to Unpublished Drafts. I'm not linking to these pages anywhere on any of my Published pages, yet Google is still looking for them, still showing them in Crawl Errors as Not Found.
I'm assuming that Google found these pages at some point and can't find them now. I'm not sure if I'm supposed to keep setting up 301 redirects for these, or whether I should use the Disavow tool on them. I want to tell Google to forget these pages completely, because I never intended for them to be indexed.
This happens on just about all of my WordPress websites in Google Search Console. Can someone please shed some light on this? If there are any articles on this problem, please share! Thanks!
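For anyone who wants to check the same thing on their own site, here's a minimal Python sketch (the domain and the flagged URLs are placeholders, not my real sites) that verifies each URL GSC is flagging really returns a 404 and confirms it isn't listed in the submitted sitemap:

```python
import requests
import xml.etree.ElementTree as ET

# Placeholder values -- swap in your own sitemap and the URLs GSC flags.
SITEMAP_URL = "https://example.com/page-sitemap.xml"
FLAGGED_URLS = [
    "https://example.com/theme-demo-page/",
    "https://example.com/another-demo-page/",
]

# Pull every <loc> entry out of the submitted sitemap.
sitemap_xml = requests.get(SITEMAP_URL, timeout=10).text
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
sitemap_urls = {
    loc.text.strip()
    for loc in ET.fromstring(sitemap_xml).findall(".//sm:loc", ns)
}

# Report the live status code of each flagged URL and whether the
# sitemap still mentions it.
for url in FLAGGED_URLS:
    status = requests.get(url, timeout=10).status_code
    print(f"{url} -> HTTP {status}, in sitemap: {url in sitemap_urls}")
```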
-
RE: URL Errors for SmartPhone in Google Search Console/Webmaster Tools
Looks exactly the same on my phone as it does on desktop. The pages coming up as 404s in GSC under Smartphone are NOT listed on this page-sitemap.xml page.
-
RE: URL Errors for SmartPhone in Google Search Console/Webmaster Tools
I am using Yoast, but I am only using the Page sitemap, and it is the only one I have submitted for the affected sites. Again, this doesn't really explain why it's coming up under Smartphone and not Desktop. Also, Google does tell you where these are linked from, and it does say the sitemap page, but when I looked at the sitemap page, these pages are not listed. I am looking at it on my desktop, though, and not my smartphone. If I looked at the /page-sitemap.xml page on my phone, would it look any different? :::quickly picks up his phone and tests it out:::
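For anyone curious, you can also compare what a desktop client and a phone-style client each get back, since some mobile plugins and themes serve different markup depending on the user-agent. A minimal Python sketch (the URL and user-agent strings are just placeholders) follows:

```python
import requests

# Placeholder URL -- swap in your own sitemap or one of the flagged pages.
URL = "https://example.com/page-sitemap.xml"

USER_AGENTS = {
    "desktop": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "smartphone": "Mozilla/5.0 (iPhone; CPU iPhone OS 8_0 like Mac OS X) "
                  "AppleWebKit/600.1.4 (KHTML, like Gecko) Mobile/12A366",
}

# If the server varies its response by user-agent, the two fetches can
# differ in status code, size, or the URLs linked inside.
for name, ua in USER_AGENTS.items():
    resp = requests.get(URL, headers={"User-Agent": ua}, timeout=10)
    print(f"{name}: HTTP {resp.status_code}, {len(resp.content)} bytes")
```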
-
RE: How long will this Site be punished? (place your bets!)
At the end of the day, it might be worth switching domain names, depending on how important the domain is to you.
Fortunately, I've never had to do it myself, but I've read a million times that once you've disavowed links and made a genuine attempt to remove each and every link that could be causing you issues, you have the option to submit a "reconsideration request" to Google.
Read this guide, it's great, and the reconsideration request is mentioned in Step 7: https://moz.com/blog/ultimate-guide-to-google-penalty-removal
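For reference, the disavow file you upload in Search Console is just a plain text file listing URLs or whole domains, one per line; a minimal sketch with made-up domains looks like this:

```
# Disavow individual URLs:
http://spammy-links.example/paid-directory.html

# Disavow an entire domain:
domain:link-farm.example
```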
Best of luck! Let us know if anything changes in the future!
-
RE: How long will this Site be punished? (place your bets!)
I don't think anyone can give you a real answer to that question. Any answer would be speculation. There's also not enough info here.
Have you gone through Google Search Console and looked under Manual Actions? How does the rest of GSC look in terms of errors and such?
I would try using the Moz Spam Analysis tool and see if you missed any other bad links.
It could be that Google is manually keeping you off the first page. I don't know if that is a real thing or if they actually do that, but I suppose it's possible! Have you tried sending Google a message explaining all the steps you have taken to try to remove the spam?
-
RE: URL Errors for SmartPhone in Google Search Console/Webmaster Tools
I'm seeing similar issues. I was going to post a question, but found this when I searched.
I'm using WordPress, and I have some theme pages set as drafts so that I can access them on my end, while the public gets a 404. In Google Search Console, under Desktop, none of these pages come up. But under Smartphone, Google is somehow finding these unpublished draft URLs and reporting them as 404 Not Found errors.
My questions are: why is Google seeing these pages at all, and why does this trigger under Smartphone only and not Desktop? And the last question is the obvious one: how do I fix this!?
Thanks!
-
RE: Net Neutrality: FCC Votes To Make Internet Public Utility
Here is Mashable's article on what's next: http://mashable.com/2015/02/27/net-neutrality-whats-next/?utm_cid=hp-hh-pri
Best posts made by Millermore
-
Net Neutrality: FCC Votes To Make Internet Public Utility
It sounds like it is now official: the FCC has voted to make the Internet a Public Utility, supporting Net Neutrality. But before I jump for joy, I'm asking myself, "What exactly does this mean?"
I know at least this much: ISPs won't be able to throttle data, and they won't be able to package together access to websites for additional fees the way they do with television channel packages. That's great, in my opinion. Even though I'm a Libertarian and I believe strongly in freedom, I know that would have a seriously negative impact on the Internet, especially for people like us who rely on it daily for our livelihoods.
The problem I'm finding is that what they voted on contains ~322 pages of new regulations for the Internet. I have no idea what is in those 322 pages, and I doubt anyone who voted on it does either. The Democrats are loving it, while the Republicans are calling it "Obamacare for the Internet". My mind goes more in the direction of: there must be pros and cons. I'm just very curious what those pros and cons are, and what this will actually mean for us in the online marketing industry, as well as for anyone who works on the Internet.
I'm not looking for any answers, and I'm especially not looking for a political or biased debate. But I think there should be a place where we can discuss this issue, because it has the potential to be extremely important to us.
Please share your thoughts, findings, and research here where we can discuss them. I'm looking forward to learning from you all, and I hope I can add some useful insights to this conversation.
Once more: please, do not turn this into a political debate - this is not the place for it. Please keep it to how Net Neutrality and Internet as a Public Utility will affect the Internet and Online Marketing landscape for us, our clients, and our customers.
-
RE: Ranking keyword ecommerce product
I recommend you do some keyword research using the Google Keyword Tool or Keyword Planner and see what people are searching for. See how many people search for it with the space vs. without the space. That should give you a good idea. You also need to see whether you'll be able to rank without the space by checking the competition level, as well as the keyword difficulty, using Moz's tools.
There may also be other alternative formats that people are searching for that you can take advantage of. For example, maybe there are a bunch of people who aren't using the last 3 numbers, or a lot of people who just type in the model number without "New Balance" in front of it. Doing good keyword research will help you decide which keywords to target.
You can also use Moz's SEO Beginner's Guide, which walks you through keyword research and what to do afterwards. There is also a great post on the 3-tier keyword system, along with a video, at Moz.com/academy. I highly suggest you check those out.
-
RE: Help with homepage SEO please
I took a look at the site, and you're right, there are a lot of links in that mega-menu. However, it looks like you can reduce the number of links in the mega-menu itself by rethinking how you structure the links and pages. For example, if you go to the Franchise tab and look under Get In Touch, there are 6 links right there you can get rid of, since all 6 go to the exact same page. You don't need 6 links like that pointing to one page, especially when some of them only have one line of text. I recommend going through all of those links to see which others are like this and how you can reformat them. It looks as if a designer decided how to structure it without thinking about SEO and wanted to include as many links as possible to make the mega-menu look more complete. I would explain to the client that the designer made a mistake and that you need to rethink the links.
-
RE: Looking for a Way to Standardize Content for Thousands of Pages w/o Getting Duplicate Content Penalties
I would agree with the other two commenters here: you don't need to worry about duplicate meta descriptions, but each page needs to be unique to a certain extent. I'll try to add something different to this discussion. If we're going by what Google and Matt Cutts say, and we're interested in white-hat-only techniques, then I don't think he would suggest you create so many different pages if they aren't going to be very different. If you have many pages that aren't very different, then what value is that giving to the user? Or are you actually attempting to game Google (black-hat) by creating all these pages strictly for SEO purposes? If so, perhaps you should reevaluate your strategy.
However, if each and every location and topic is different and contains unique content such as completely different schedules and topic content, then I don't think you should have much to worry about. Just make sure that the actual content of each page is unique. Once you start creating dozens of duplicate pages, it may make more sense to try and figure out a simpler way to build out your site. You can try to balance and compare the risk of duplicate content to the benefit of having so many pages. Just focus on different content for each location and topic and you should be fine. In fact, Moz will tell you if you have duplicate content in your Crawl Diagnostics.
-
RE: Scheduled Custom Reports Not Running
I reported similar issues with many of my reports. They told me they are dealing with DDoS attacks, which are affecting the reports.
-
RE: Help with homepage SEO please
Here's a video from Matt Cutts himself discussing how many links you should have on your page, and if there is a limit.
-
RE: Looking for a Way to Standardize Content for Thousands of Pages w/o Getting Duplicate Content Penalties
I totally understand what you're trying to do. What I'm trying to say is that there may be another way to get this location-specific information to your users. Perhaps if you had one "sharepoint training" page, you could include all the locations there, with a schedule that changes when you hover over or click on a location but keeps you on the same page. This would likely be much safer with Google and would reduce the amount of work significantly. However, you may lose potential SEO value without the individual pages for each location. Again, it's a balance: if you are able to create the pages without them being seen as duplicate content, then you're safe. If you can't make them unique, try to think of another method.
-
RE: Error reports showing pages that don't exist on website
For me personally, on WordPress I use the Yoast SEO plugin, and I went through the tutorial on the Yoast website. It shows you how to eliminate a lot of the duplicate content that automatically gets created on all WordPress websites. Once you noindex and get rid of all the unnecessary archives and such, I would recommend going back to the error report to see the difference and whether those pages keep coming up. If they do, simply 301 redirect them to another page on your website. Then check again after you redirect them and see what you're left with. From what I've seen, it sometimes takes a couple of weeks for the changes to be reflected. I'm not sure if this is the exact issue you're having, or if you're even using WordPress at all, but if you are, this might help you; it helped me get my errors down to zero.
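If you end up with a long list of 404 URLs to redirect, here's a minimal Python sketch (the paths and domain are made up) that turns an old-path-to-new-path mapping into Apache Redirect 301 rules; the Redirection plugin does the same job from inside WordPress, so this is just one way to batch it:

```python
# Hypothetical mapping of crawl-error paths to their best replacement pages.
REDIRECTS = {
    "/old-theme-demo/": "/services/",
    "/sample-page-2/": "/",
    "/portfolio-item-3/": "/portfolio/",
}

# Emit one Apache mod_alias rule per entry; paste the output into .htaccess.
for old_path, new_path in REDIRECTS.items():
    print(f"Redirect 301 {old_path} https://example.com{new_path}")
```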
-
RE: How/why is this page allowed to get away with this?
Cyto,
I like your thinking on this one. This is where I was trying to go with it. But still, you asked many of the same questions that I asked. I realize we won't have a solid answer unless Matt Cutts himself speaks on this specific issue. However, I'm still left with unanswered questions. Here are a few points that are still standing:
- I realize there are billions, if not trillions, of websites and pages in existence. However, there are not billions of pages at a PageRank of 7. You can try to disregard their PageRank and tell me it's going to be deprecated soon, or that it's not accurate, or whatever. But regardless, they got that page to a PR7. If you think that doesn't matter, I'd like to see you try to get your page to PR7 and tell me how long it took you. What I'm saying is, I don't think they magically got to PR7 overnight, and I don't think Google has missed this site. There are only so many PR9s, PR8s, and PR7s out there. What are the chances that Google completely missed AND messed up on the PageRank for this site? The only other explanation I have is that they were white-hat for a long time, and when they got to PR7, they flipped to this black-hat type of page. But I doubt that's the case. They're either still benefitting from black-hat techniques, OR we are misjudging this site and Google actually does think it deserves a PR7.
- Try thinking about it like this: yes, this page is practicing many things that are straight-up black-hat, things that Matt Cutts has publicly and openly said are considered spam. Just simple things like the text/HTML ratio, the number of links per page, asking to trade links, or having massive numbers of links without nofollowing any of them. What if Google saw this page, decided it was black-hat, and penalized it? Now assume the page is penalized. But what if all the sites listed on there are linking back to this page, and the link juice from those pages pointing back is so much more powerful than the penalization that it overpowers it, bringing them to a net PR7? The question here is: can you overpower Google's penalization with more bad backlinks?
- Looking deeper into the whole .org/non-profit/maybe-Google-likes-these-types-of-pages angle: perhaps they do, and we're all just wrongly assuming things. In this case, I agree with Cyto; this page could be unique, and it does benefit the user. However, isn't this the exact scenario in which Matt Cutts has told us to implement a nofollow tag (there's a bare-bones example of the tag after this list)? I believe he has said repeatedly: if you must link to another site and you're not sure about it, just put on a nofollow tag. If you have reciprocal links, there's no need to get rid of them; simply nofollow the links. It's this sort of thing that is giving me trouble fully accepting that this is a good page and Google likes it. And IF Google does like this page, and the PR7 is deserved, and the followed links are fine, then I SHOULD try to get my client a link on this page. But I suppose there is a risk, because we won't know 100% for sure unless Matt Cutts says so.
- Diving deeper into the "Google may like this sort of page" idea for the reasons you stated: it sort of contradicts what Matt Cutts has already said. For example, if I put a link in a press release back to my homepage, there is some value in that link to the user, because it makes it easier to visit that page by clicking instead of typing in the URL. Yet press release links have all been nofollowed across the Internet. You can use the same "it creates value" argument for any link, but Google is telling us to nofollow these links. Especially when talking about directories specifically, I have read that Google is shutting these sites down completely. However, we are left wondering if this specific site is on some sort of "white-list". In that case, the first person to create a "directory of white-listed directories of followed links" will, I'm sure, be quite successful with that page.
- What is stopping me from creating a .org page similar to this? Why can't I build a page up to PR7 and openly exchange links with people? The biggest thing stopping me from even thinking about something like that is that I assume this only worked 5+ years ago. Regardless, I have a client who sells a few unique products, and one of their competitors is Quirky.com, which led me to this page because they have a backlink from it. The problem I'm seeing is that Quirky.com is benefitting from a link on this page, while I'm worried about joining it due to a potential penalization. Quirky doesn't really have to worry about anything, because they have so many links and they're established. But if I want to get the same link as them, I have to worry. This is the sort of thing that makes it hard to compete with the big players. Not that I think this client is on par with them, but I get the feeling that they're allowed to do more than we are. Perhaps I'm wrong, but it's the feeling I have.
- It's getting harder and harder for me to find white-hat followed-link opportunities. It seems like everywhere I go, the link is going to be nofollowed. Other people's websites want to nofollow the link. Guest blog posts want to nofollow the link. Press releases are all nofollowed now. Either the link is nofollowed, or you risk penalization on a followed link. This is the corner I feel I'm getting pushed into.
- I learned a while back from an SEO that links are the most powerful form of currency in the SEO world. A link is the number one way to move up in the rankings, because it is basically a sign saying "this site is trustworthy and worth checking out", and Google puts those signals together to decide who is worthy of higher rankings. It all makes sense to me, and I haven't seen anything to tell me otherwise; if I'm wrong and I missed something, let me know. I mean, it's great to put out unique content and all that, but what is the point of the guest post or the press release if there's no indication that you wrote it or that it has anything to do with your site? What is it worth at that point if there is no link included? I understand the organic side, that some people may literally read it and visit your site off that, but that's an inefficient way of doing things. I'm down with "link-earning", but only if I can actually earn a followed link. What's the point of a link-earning process if you don't earn the link, know what I mean? It just seems like everything is going this way of nofollowing links, or you have to worry about a penalization. And before you say it, I am aware that less than 20% of all links are nofollowed, but still, this is the feeling I'm getting. (That number may be higher now that all press release links are nofollowed; I'm not sure.)
- I'm really not trying to do anything black-hat. I'm trying to do white-hat stuff here, but with the purpose of accelerating my client's climb up the rankings. Listen, I'm doing all the other stuff well; it's just that this whole link-building/earning aspect is tough, and it seems like 2014 is going to be much harder than previous years.
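Since the nofollow tag keeps coming up in these points, here's the bare-bones shape of it (the URL is made up); the rel attribute is the only difference between a followed and a nofollowed link:

```html
<!-- Followed link: passes PageRank to the target. -->
<a href="https://example.org/widgets/">Great widget site</a>

<!-- Nofollowed link: asks Google not to pass PageRank. -->
<a href="https://example.org/widgets/" rel="nofollow">Great widget site</a>
```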
What are your thoughts on these points?
-
RE: Error reports showing pages that don't exist on website
Rel=canonical is used more when you have duplicate content. If you have the same post or page in two places, you can use the rel=canonical tag to tell Google which version is the original. It sounds like you don't need rel=canonical in this situation.
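For completeness, the tag itself is one line in the head of the duplicate page (placeholder URL here):

```html
<!-- In the <head> of the duplicate page, point Google at the original: -->
<link rel="canonical" href="https://example.com/original-post/" />
```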
It sounds like you have 80-something 404 Page Not Found errors. I would use the "Redirection" plugin with WordPress. Take each URL that is giving you a 404 error in your report, and redirect it to the page most relevant to what was supposed to be on the missing page. If there really is no relevant page at all, I would just redirect it to the homepage. In my opinion, it's better to redirect to the homepage than to have the user land on a 404 page. I would do that for every 404 error you are getting. If you do this, I don't think you'll need rel=canonical at all.