Hi and welcome!
I'd check out these two links. They have a ton of how-tos and videos and FAQs... I reference these ALL THE TIME!
Hope this helps.
Mike
Hi Jess,
Using Screaming Frog, it looks like your /blog page actually has 131 links. If you add up your footer (30), plus links to your homepage (6), plus pagination (9), plus your Link Building and Content article (5), and your Alex Bogusky Video article (6) - you already have 50+, and that is not including top and side navigation or the rest of the articles on your page.
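As a rough sanity check (a hypothetical sketch, not how Screaming Frog itself works), you can count the links on a page with a few lines of Python:

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Counts <a href="..."> tags - roughly what crawlers treat as links."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

# Toy markup standing in for a real page:
counter = LinkCounter()
counter.feed('<nav><a href="/">Home</a> <a href="/blog">Blog</a></nav><p>no link here</p>')
print(counter.count)  # 2
```

Feed it your rendered page source and you will see how quickly footer, navigation, and pagination links add up.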
Matt Cutts sums things up really well in this article saying:
"...Google will index more than 100K of a page, but there’s still a good reason to recommend keeping to under a hundred links or so: the user experience. If you’re showing well over 100 links per page, you could be overwhelming your users and giving them a bad experience. A page might look good to you until you put on your “user hat” and see what it looks like to a new visitor.
But in some cases, it might make sense to have more than a hundred links. Does Google automatically consider a page spam if your page has over 100 links? No, not at all. The “100 links” recommendation is in the “Design and content” guidelines section, and it’s the Quality guidelines that contain the things that we consider webspam (stuff like hidden text, doorway pages, installing malware, etc.). Can pages with over 100 links be spammy? Sure, especially if those links are hidden or keyword-stuffed. But pages with lots of links are not automatically considered spammy by Google.
So how might Google treat pages with well over a hundred links? If you end up with hundreds of links on a page, Google might choose not to follow or to index all those links. At any rate, you’re dividing the PageRank of that page between hundreds of links, so each link is only going to pass along a minuscule amount of PageRank anyway. Users often dislike link-heavy pages too, so before you go overboard putting a ton of links on a page, ask yourself what the purpose of the page is and whether it works well for the user experience."
Hope this helps.
Mike
Hi Christina,
For the location that you actually have an office, I would suggest having a single page devoted to that particular location (landing page as Bereijk suggests). Then I would use GetListed.org and list that physical location's information in all of those business directories.
I would then create landing pages for the other locations using a similar structure seen in Perfecting Keyword Targeting & On-Page Optimization.
That said, if the particular search terms you are going after include local results, only queries regarding your physical location will appear in the local results. So for instance, if someone searches "SEO" from, say, Essex, and your physical address is in Essex, and Google applies the local SERP, your address and site information may appear; however, for the location landing pages you create, someone would have to search for "SEO London" to find your SEO London page, because it does not have a physical address located in London... does that make sense?
(The above is just from my personal experience. We did this for a ton of different locations at a previous job I held. In my opinion it was a waste of time, because my particular industry [at the time] would not have searched for "ERP software los angeles" - software was not part of the localized SERP [at least at the time], so no one was even making it to those pages.)
Hope this helps (and sorry for rambling),
Mike
If you go to your Campaign and under the Rankings tab you can export your full rankings report to CSV. Once you have that report, you can quickly copy and paste the keywords from that campaign across your other campaigns.
Hope this helps.
Mike
Hi Oliver,
In the research I have done on this subject, between 2005 and 2009 the NoFollow directive was used to preserve link juice and keep it from passing through to the particular link. This would allow you to reallocate that link juice to other, more important links on your page. It was mainly developed to discourage comment posters from spamming sites with misc. links; however, web masters soon figured that you could "link sculpt" your site... more or less hoarding your link juice and only pushing it through to your really important pages... making them rank higher.
In 2009 things changed. Matt Cutts commented on the subject saying, "So what happens when you have a page with “ten PageRank points” and ten outgoing links, and five of those links are nofollowed? Let’s leave aside the decay factor to focus on the core part of the question. Originally, the five links without nofollow would have flowed two points of PageRank each (in essence, the nofollowed links didn’t count toward the denominator when dividing PageRank by the outdegree of the page). More than a year ago, Google changed how the PageRank flows so that the five links without nofollow would flow one point of PageRank each." - more or less stating that link juice is discarded rather than reallocated.
Matt Cutts also says, "Nofollow links definitely don’t pass PageRank." - So while the PageRank flows through the link, it is discarded, and is not passed to the linked to page.
"The essential thing you need to know is that nofollow links don’t help sites rank higher in Google’s search results." - Matt Cutts
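Matt Cutts' numbers above can be sketched as a tiny function (a simplified model that ignores the decay factor, as he does in the quote):

```python
def pagerank_per_followed_link(points, total_links, nofollowed, pre_2009=False):
    """PageRank flowing through each followed link on a page.

    Pre-2009, nofollowed links didn't count in the denominator;
    afterwards they do, so their share is simply discarded.
    """
    followed = total_links - nofollowed
    denominator = followed if pre_2009 else total_links
    return points / denominator

# Matt Cutts' example: 10 PageRank points, 10 links, 5 of them nofollowed.
print(pagerank_per_followed_link(10, 10, 5, pre_2009=True))   # 2.0 per link (old behavior)
print(pagerank_per_followed_link(10, 10, 5, pre_2009=False))  # 1.0 per link (post-2009)
```

The takeaway is the same as the quote: sculpting with nofollow no longer concentrates link juice; the nofollowed share just evaporates.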
I don't know if the subject of the article you were reading was about sculpting your site, but depending on who you talk to, there are many other tactics that can give you a higher ROI.
I hope this helps answer your question.
Mike
Personally, I would try and mix it up. From a user standpoint, if I am on your Google+ page and am wicked impressed, then decide to visit your website and am greeted with the EXACT SAME TEXT... boring. It would seem to me as a user that you just slapped up the Google+ site just to have it.
That said, Matt Cutts and SearchEngineLand.com say, "...duplicate content issues are rarely a penalty. It is more about Google knowing which page they should rank and which page they should not. Google doesn’t want to show the same content to searchers for the same query; they do like to diversify the results to their searchers."
So will you get into trouble - No.
Will Google rank your homepage higher than your Google+ page - It should.
Will your users have a fantastic experience - Probably not.
Hope this helps.
Mike
Hi Jesse,
It all depends.
It depends on the number of pages, the number of layouts, whether they need to write copy, perform usability testing, etc. It also depends on the type of platform they are developing on OR whether you are getting a new platform.
The price changes with the number of iterations needed on layouts/design and the time frame you are looking at.
It also depends on whether you have a freelancer or an agency do it. Without knowing the complete scope, I would guess somewhere between $50k-$250k if it is a pretty large site. If it is only like 50 pages and you only did a few templates, you could maybe get away with spending $10k-$30k... depending on the people you work with.
First, I would look at design firms available in your area. Check out the websites they have developed and contact someone at the company they developed for and ask them about their experience. Do this for a handful of firms. Then contact each firm and ask them for a quote. After you have quotes from firms, go out and ask some freelancers... look at their work... ask for a quote. You will probably have to have some sit down, in person interviews with these people to make sure that they really get what you are asking for.
I know that doesn't completely answer your question, but hopefully it gives you a few things to think about.
Mike
Hi David,
It sounds like you did everything right. As long as you cannot see the dev site in the live SERPs, you should be safe.
I have seen it take MONTHS for WMT to update any information regarding crawl stats, resolved errors, etc.
Just keep a close eye on things, but you should be safe if you can't find it by doing a Google search.
Hope this helps,
Mike
PS you may want to check Bing as well.
When you are in your campaign settings, under Manage Keywords, place a check in the box(es) adjacent to the keyword(s) you want to associate a label. Then, use the Add/Clear Labels dropdown to select a current label or create a new one. Finally, click Add Labels to add the selected labels to the selected keywords.
Hope this helps.
Mike
Hi David,
This typically happens when your PDFs are full of content people want. I have seen this personally at a previous company I worked with and their spec sheets (it was a software company). This is good and bad - good that people find your content awesome and are linking to it - bad that your PDFs are ranking instead of your pages.
Solution - make your pages better. You could potentially take the content from your PDFs and make a page - in theory this should still compel people to link to you. If your PDFs are massive in size, you could consider condensing the contents into a webpage that contains FAQs, a summary of the information, etc.
This isn't a bad problem to have. If you can't beat them, join them - optimize your PDFs for search.
Hope this helps.
Mike
Here is a step-by-step of what you could try doing from Cypress North: Hacked? Here's How to Remove the Dreaded Google Malware Warning.
Google also says, "If you're the administrator of a site we've identified with this warning message, learn how webmasters can fix this issue. Note that in some cases, third parties can add malicious code to legitimate sites, which would cause us to show the warning message."
Good luck.
Mike
Crazyegg measures how far people are scrolling down on your site (their website is down right now... which is odd). I have used them before and you get some really good data.
Clicktale is another one that can not only do scrolling heatmaps, but can also do actual recordings of how your visitors are interacting with your site. They have a free and a paid version. I have not used them... yet... but am looking at implementing this over the next few months.
Hope this helps.
Mike
I would check out Wistia - "SEO embeds use old object tags so they're crawlable by those creepy search engine robots that make you Internet famous. SEO is inconsistent with IFrames; Google's official stance is that they don't crawl IFrames, although sometimes things like an image still or link might sneak their way in. These also use noscript tags to make sure transcripts are visible to everyone. However, they lack the universal compatibility of IFrames."
And they just added backlinks if someone embeds your video too.
Hope this helps.
Mike
At this point, no - there are not positive or negative effects you'd need to worry about.
If you want to switch from one to the other, you will want to make sure you have the proper redirects/canonical tags in place.
One way to choose which option is best for your situation would be to check your analytics and see which site is getting more traffic (if both the http and https versions of your site are accessible).
Matt Cutts did make a comment earlier this year that he personally would want to give a boost to secure sites - but that is not currently in place.
Depending on your audience https may be more appropriate. For instance, if you Google "banking" - you can see both http and https websites ranking - savvy users will probably notice the secure sites and visit those. I guess it all depends on your industry and your audience though.
Hope this helps.
Mike
Hi Aditya,
That is personal preference. Search engines used to treat paths ending in / as a directory and paths not ending in / as a file. Here is a great article on the subject from the Google Webmaster Tools Blog.
I would say the optimum is 3 to 4 levels deep, but depending on the size of your site, you could probably go up to 6 without too much of a problem.
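If it helps to make "depth" concrete, here is a quick hypothetical way to count path segments in a URL with Python:

```python
from urllib.parse import urlsplit

def url_depth(url):
    """Number of path segments: /blog/2013/post.html -> 3."""
    return len([seg for seg in urlsplit(url).path.split("/") if seg])

print(url_depth("http://example.com/blog/2013/post.html"))  # 3
print(url_depth("http://example.com/"))                     # 0
```

Run your URL list through something like this and look for anything deeper than 5 or 6.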
Hope this helps,
Mike
Yes, that is considered duplicate content. The common URLs you should check for include:
I might have missed a few more, but those are the ones I usually look for.
You can review this Google Webmaster Tools article on how to fix duplicate content issues by using 301 redirects, selecting a preferred domain in GWT, using the canonical tag, and more.
Typically (though not in all cases), an http/https issue like this happens because a secure page on your website uses relative links instead of absolute ones. So when a visitor goes from http to, say, a sign-in page on https, from that page onward every relative link points to the https version instead of the http version the visitor came in on.
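To make the variants concrete, here is a hypothetical sketch (example.com and the preferred scheme/host are made up) of how the common duplicates collapse onto one canonical URL:

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url, scheme="http", host="www.example.com"):
    """Map common duplicate-content variants onto one preferred URL."""
    parts = urlsplit(url)
    path = parts.path.rstrip("/") or "/"  # normalize the trailing slash
    return urlunsplit((scheme, host, path, parts.query, ""))

variants = [
    "http://example.com/page/",      # non-www + trailing slash
    "https://www.example.com/page",  # https
    "http://www.example.com/page/",  # trailing slash
]
print({canonicalize(v) for v in variants})  # {'http://www.example.com/page'}
```

In practice you would enforce this with 301 redirects or canonical tags rather than code, but the mapping is the same idea.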
Hope this helps.
Mike
Hi Joel,
Sounds like you have everything setup correctly so far with the canonical and GWT.
The next step would be to do a 301 redirect from the https version to the http version. After that, combined with your changes above, it should get removed from Google's index.
Note: You may also want to run a scan with a program like Screaming Frog to ensure that you are not accidentally linking to the https version somewhere on your site or in your sitemap.
Hope this helps.
Mike
You should try and get your business listed in all of the directories listed on https://getlisted.org/
Hope this helps.
Mike
While your internal links are pointing to the non-www version, you have 14 linking root domains pointing to your www version.
You can use opensiteexplorer.org to get this info.
Also, check out the best practices for canonicalization to help you fix this issue.
Hope this helps.
Mike
Hi Nenad,
I guess it all depends on your marketing strategy. If your company is THE watch company and people are mainly coming to your website for watches and nothing else, then it would make sense to move them to a subdomain/new domain and 301 redirect from the old pages to the new ones.
But if your company is THE watch company that is now expanding into other daily accessories (ie: leather wallets, bags, belts) you could be transforming your company into THE accessory company. You would simply have a landing page for watches, one for wallets, one for bags, etc.
Does that help?
Mike
Ideally you'd want to have a different landing page for each of Indianapolis, Columbus (OH), and Louisville (KY).
You'd do onpage optimization on each of those landing pages for your service and the location (including the physical address and phone number of your office location), as well as offpage optimization in the business directories on GetListed.org.
As far as examples, you should analyze your competitors that are winning in your space and improve upon what they are already doing to get to where they are.
Hope that helps.
Mike
From those analytics, it almost looks like he started his website in Jan 2013... if that is the case, it can take some time to establish authority and ranking. If your rankings increased, then it makes sense that your traffic would also increase.
For a site that I optimized, we saw some immediate results; however, after 3 months of building up authority, rankings shot up and so did traffic.
It is tough to say exactly what is causing that... looks like you must have done something good : )
Mike
I would think that the only struggles you may have would have to do with the company's industry more than the DA of the site. You would have the same "struggles" battling for a top position if the site had a DA of 23 and the competitors had a DA of 20, 21, and 22.
I would suggest that you spend some time doing some competitive analysis. Not just top level, "Who has higher DA and more links, or more likes, etc." but, who has better content, better meta descriptions (doesn't help SEO, but helps in getting users to click on your site in the SERPs), a deeper content library, etc. Use Google to search your competitors and find every web page, PDF, Word Doc, Presentation, Video, etc. and then compare their inventory against your own. Once you see what is working for them, put your own spin on it and make it better OR just make it current if they have great content, but it is out of date.
When you are dabbling in the lower DAs, basic SEO modifications are what make the biggest/fastest change, but when you are dealing with the big dogs - sometimes it is more difficult to move the needle, so you need to think of the little things, like giving users a more compelling meta description and better, more relevant content.
I would suggest that you:
Traffic and rankings are relatively easy. Ensuring that you are providing each and every visitor value is not - this is what I would suggest you figure out how to do.
Hope this helps.
Mike
Hi Fabrizo,
Shorter URLs are preferred. In your example I personally feel it makes sense to remove download and indici from your URL structure, because that information does not mean anything to me, the user (maybe it does to someone who plays violins... so that statement might not be true, but to me that information is irrelevant).
Now are you going to go from page 2 to page 1 with a change like this... maybe? There aren't any hard facts that will say how big of an impact that will make; however, the closer a keyword is to your domain in terms of your URL, the more weight is assessed to it in terms of keyword association - if that makes sense. So...
domain.com/downloads/Indici/Violin.html appears less relevant than domain.com/violin/ if a user is searching for "violin".
Also, Dr. Pete wrote an article a few years ago on changing URLs for SEO purposes. I think that all of the information still stands true for today.
Hope this helps,
Mike
Kind of, but only for 24 hours via an iframe: "The reviews response group returns the URL to an iframe that contains customer reviews. You can embed the iframe on any web page to display the response content. Only the iframe URL is returned in the request content." - Product Advertising API
Hope this helps.
Mike
Hi Ash,
Like Tom, I am a huge fan of Screaming Frog.
Another option is Xenu's Link Sleuth which is completely free, while the Screaming Frog free version has a 500 URI crawl limit.
I personally think it is worth it to purchase the licensed version of Screaming Frog... but I also need to for my site.
Good luck.
Mike
It appears that your problem is that you have 2 canonical tags. One referencing the desktop site and one referencing the mobile site.
Remove the /mobile/clearwater/bankruptcy-clearwater-lawyer-attorney.html tag from that page and you "should" be good to go.
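If you want to verify this kind of problem programmatically across your pages, here is a hypothetical sketch that flags markup carrying more than one canonical tag:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collects every rel=canonical href found in a page's markup."""
    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonicals.append(a.get("href"))

# Two conflicting canonical tags, like the desktop/mobile clash described above:
finder = CanonicalFinder()
finder.feed('<link rel="canonical" href="http://example.com/page.html">'
            '<link rel="canonical" href="http://example.com/mobile/page.html">')
print(len(finder.canonicals))  # 2 -> more than one means something needs removing
```

Anything returning a count above 1 is a page where crawlers have to guess which canonical you meant.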
Hope this helps,
Mike
You can enter up to 20 keywords at a time, separated by a comma, tab, or new line.
Hope this helps.
Mike
Hi Dana,
It looks like some of the *.cat links are still being used through your site.
For instance, /StoreFront/B8I.cat is actually 301-ing to /StoreFront/category/B8I.cat and /StoreFront/B8I.cat (with anchor text "headphones or earbuds") is being used on the /StoreFront/category/controlling-the-low-end-on-your-stage page.
The /StoreFront/category/controlling-the-low-end-on-your-stage page is at level 7 in your site and is linking to /StoreFront/B8I.cat, making it at level 8, which then 301s to /StoreFront/category/B8I.cat, keeping it still on level 8 because of the 301, and finally redirecting with a 301 to StoreFront/category/headphones (which is a level 1, because it is accessible from the homepage (level 0) navigation).
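The chain above can be modeled with a small hypothetical helper that follows a 301 map until it stops:

```python
def resolve(url, redirects, max_hops=10):
    """Follow a 301 map until the URL stops redirecting; return (final_url, hops)."""
    hops = 0
    while url in redirects and hops < max_hops:
        url = redirects[url]
        hops += 1
    return url, hops

# The two 301s described above:
redirects = {
    "/StoreFront/B8I.cat": "/StoreFront/category/B8I.cat",
    "/StoreFront/category/B8I.cat": "/StoreFront/category/headphones",
}
print(resolve("/StoreFront/B8I.cat", redirects))
# ('/StoreFront/category/headphones', 2) - a two-hop chain worth flattening
```

Ideally you would update the internal links to point straight at the final URL so crawlers never hit the chain at all.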
Does that make sense?
It did in my brain... just let me know if it does on paper : )
Hope this helps.
Mike
Hi Margarita,
I have encountered similar issues when you need to enter a password or where the site might require cookies.
Here is what Screaming Frog has to say on the topic:
"13) Why Won’t The SEO Spider Start?
This is nearly always due to an out of date version of Java. If you are running the PC version, please make sure you have the latest version of Java. If you are running the Mac version, please make sure you have the most up to date version of the OS which will update Java. Please uninstall, then reinstall the spider and try again.
14) Why Won’t The SEO Spider Crawl My Website?
This could be for a number of reasons. Before contacting us, please check your robots.txt to see if the website is blocking the SEO spider. Please also ensure the website is html (the SEO spider does not crawl framesets), that it can be crawled without JavaScript or requiring cookies. Please also check for the ‘nofollow’ attribute if certain links are not being crawled. There is an option in the configuration to crawl ‘nofollow’ links."
Also, SEER has a great article about how you can use Screaming Frog 55 different ways - just some good info.
Hope this helps.
Mike
Hi Ruben,
I believe what Matt Cutts was getting at is that guest blogging, for the sole purpose of SEO is going to get you penalized.
If you are guest blogging to drive awareness of your brand, provide thought leadership, etc., it is still something that is accepted and encouraged.
SearchEngineWatch.com has a great article regarding Matt Cutts' comments on guest blogs, as well as a quote from Ryan Jones, "Guest blogging can still work. You wouldn't turn down a column on CNN or an editorial in the Huffington Post if they said you wouldn't have a dofollow link would you?"
Hope this helps.
Mike
Hi David,
This is completely normal. My understanding is that it happens for a few reasons:
Just to name a few.
You shouldn't have to worry about anything UNLESS you see a big drop that is constant. If you move from position #4 to #24 and it stays like that for over a week... you may need to investigate things further; however, if you are just seeing some bouncing around, but are usually maintaining a good position, I would not worry about it.
Hope this helps.
Mike
Hi David,
In looking at your source, it looks like your meta description is, "Buy Trust Deeds from Trimark Funding Inc. - Buy Trust Deeds for High 9% Annual Returns - Consistent Monthly Income."
The keyword is in the meta keywords tag, but not in the meta description tag.
Hope this helps.
Mike
You should be able to use the Google Places API to include ratings and reviews.
Or you can check out how to Embed Google+ Local Business Reviews On Your Website In 3 Easy Steps - which might be easier than tinkering around with their API.
Hope this helps.
Mike
Hi Sean,
Your system looks pretty decent... I guess it depends on your traffic load though.
Have you checked out your results on GTMetrix.com?
You scored well in the PageSpeed testing, but not in YSlow. I'd take a look and see if you could leverage some of the suggestions there.
Hope this helps.
Mike
Hi Mark,
I disagree with a lot of the points in that article. Sliders/Carousels can be done well. In many cases, I don't think brands are trying to rank for their homepage for anything other than their brand name - therefore, this strategy is serving its purpose.
IBM uses this strategy and is successful. SAP uses this strategy and is successful. And even Apple uses this strategy and is successful (I think I might have just blown a few minds with this bomb).
To your questions:
I think showing a representative sample of the logos for the companies you help is fine. Whether it is just 5 and they click a link to go to the dedicated page OR if you use a mini slider/carousel to show the wide range of companies on the homepage.
You could consider using a slider of sorts that would have a high profile company logo and a quote from someone at that company that is saying how awesome your services are... if you could get like 5 or so quotes/testimonials like that... that would be pretty awesome - in my opinion.
When done correctly, any of the text you use should be able to be indexed by crawlers, so no FLASH and no crazy difficult javascript.
If you think this is a differentiator and will help people select you as their service provider, then this is a great idea. If you are just looking to make your site look cool, you could just have a page accessible from your navigation that would link to a comprehensive list of company logos.
Hope this helps.
Mike
Hi Tim,
As of right now, you are correct in what you are suggesting. Google will not give rank to your redirect - meaning grape.popsicles is not going to appear in the SERPs.
This is a great strategy for other marketing purposes though (flyers, campaigns, etc.)... utilizing them as redirects. If you did get a bunch of sites linking to "grape.popsicles" though, that link juice would pass through a 301 to the final page.
Another option would be to create micro sites for each of the gTLDs that would then link back to your main www.company.com site. You would have to do this in a strategic way though... not just linking all of your micro sites together to try to pass link juice around. For instance, your company site could be more targeted at parents and your micro sites more targeted at kids, with fun summer activities to do while enjoying your popsicles.
Still a good buy either way you decide to take it.
Hope this helps.
Mike
Hi Matt,
Bummer to hear that.
Contact the Moz Help team: http://moz.com/help/contact
Good luck,
Mike
I am guessing you must have accidentally set up a link to /win-a-party/ instead of /win-a-party.
Search engines used to treat paths ending in / as a directory and paths not ending in / as a file.
This is a great Google Webmaster Blog article on the subject.
I would suggest you either 301 redirect it to the preferred version OR you use a tool like ScreamingFrog to find the incorrectly setup link/navigation and make the actual fix.
Hope this helps.
Mike
Hi Jesse,
Have you already looked at Google's tool for building mobile sites?
This also has some great resources www.howtogomo.com which also is a Google site.
Hope this helps.
Mike
Hi Alexander,
If not a lot of people are linking to them AND if the page has an expiration date of sorts, you'd probably want to 410 the page.
404 and 410 both tell Google that the page no longer exists; however, the 410 (GONE) is more specific than the 404 (NOT FOUND).
The 410 "should" naturally get the page out of Google's index faster as well.
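As a minimal sketch (the paths and server setup here are hypothetical), a handler that answers 410 for retired pages and 404 for everything unknown might look like this:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

GONE_PATHS = {"/group-course-2013"}  # hypothetical retired course pages

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # 410 Gone: removed on purpose; 404 Not Found: unknown URL
        self.send_response(410 if self.path in GONE_PATHS else 404)
        self.end_headers()

    def log_message(self, *args):
        pass  # keep the sketch quiet
```

In a real deployment you'd configure this in your web server rather than hand-roll it, but the status-code decision is the same.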
Or, if it is an ongoing thing, you could consider changing your URL to /group-course and then just constantly changing the content on that particular page, letting the URL accumulate inbound links and drive traffic... I don't know exactly how you are using the page, but it's just an option.
Hope this helps.
Mike
I spoke with some of the Moz team about this at MozCon 2013 and from the sounds of it they are moving in that direction - where people can create Moz accounts and admins can grant access to their analytics data.
I wasn't given a time frame on the enhancement... hopefully soon!
Mike
I am a little confused by your question, but let me see if I understand what you are saying:
You have a product page, then you have any accompanying pop-up for it, and you just changed all of the URLs for the pop-ups and the old pop-up URLs are redirecting to the product page?
Were you getting a lot of people linking to the pop-up URLs? That is really the only reason you'd need to 301 them (in my opinion). If not, then ONLY redirect the popular pop-up URLs. This would definitely help with load times.
And your dev team is probably correct. If you have 100,000 lines of redirects, that is going to slow things down. Depending on your environment, you could potentially set up rules that would redirect things; however, it may be difficult depending on your URL structure/pattern.
As far as 410 or 404, a 410 usually alerts Google to remove the URL from its index faster than a 404, but in the end they essentially work the same.
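Your dev team's point about rules vs. 100,000 individual redirect lines can be illustrated with a hypothetical pattern-based rule (the URL shapes here are invented):

```python
import re

# One pattern-based rule can replace thousands of one-off redirect lines:
POPUP_RULE = re.compile(r"^/popups/(?P<product>[\w-]+)-popup\.html$")

def redirect_target(path):
    """Return (status, location) for a requested path."""
    m = POPUP_RULE.match(path)
    if m:
        return 301, "/products/" + m.group("product")
    return 410, None  # everything else retired on purpose

print(redirect_target("/popups/red-widget-popup.html"))  # (301, '/products/red-widget')
print(redirect_target("/popups/unknown.pdf"))            # (410, None)
```

Whether a single rule like this is feasible depends entirely on how regular your old pop-up URL pattern is.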
Hope this helps,
Mike
Go find a college kid who is good at web design. I bet you could hire him/her for the summer and get them wicked cheap. If you only have 30 pages, you'd probably only need a handful of templates/layouts. And since you are going to use a free CMS, you could probably get away with only paying a couple grand.
Good luck.
Mike
It definitely can.
Typically your homepage is going to get the bulk of your inbound links and traffic - giving it more authority than your subpages. If you target the same keyword on your homepage and a subpage, your homepage will more than likely win (though not in all cases).
If you have a new integration with a product, that page should target specific keywords related to that integration/product. If there isn't enough differing text to warrant its own page, then you should take a step back and ask whether a new page is needed OR whether you should just mention it on your homepage.
All of that said, you could rephrase and/or write about things from a different angle than your homepage and you should be fine and not have to worry about duplicate content as long as you are not just copying and pasting content.
Think about your user and whether the page is actually benefiting them or if you are just creating a page to try and get it to rank.
Hope this helps.
Mike
"The ability to select Broad, Phrase or Exact match has been removed—only Exact match data is now available." - Google Keyword Planner pros and cons as well as, "But with Keyword Planner, you'll get historical statistics only for exact match." - Keyword Planner has replaced Keyword Tool.
As far as experiencing "off" volume results, there have been different reports around the web suggesting that volume may not be as accurate as it could be - Somethings Fishy in the State of Googles New Keyword Tool [Data].
Hope this helps.
Mike
Right on.
Yeah, I would just leave them as 404s and if they are indexed by Google, they will eventually be removed from the index.
Mike
Hi Craig,
Did you submit your sitemap in Google Webmaster Tools?
Mike
Hi Dave,
If I were in your shoes, I'd set up a rule to 302 redirect all of your https pages to their http equivalent until Tuesday when you get your cert.
This will make it so if anyone clicks on your https pages in Google's index that they will be brought to the appropriate page and not get a security warning.
The 302 will also tell Google "Hey, this is just a temporary redirect... don't worry about indexing things differently." - because if you get Google to index all of the http versions and if you don't 301 those, you will get a ton of 404 errors. Which is fine... but it gets messy fast.
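The stopgap rule amounts to something like this hypothetical helper (the real implementation would live in your server config, not application code):

```python
def temporary_http_redirect(url):
    """302 any https URL to its http twin until the new cert is installed."""
    if url.startswith("https://"):
        return 302, "http://" + url[len("https://"):]
    return None, url  # already http - serve as-is

print(temporary_http_redirect("https://example.com/checkout"))
# (302, 'http://example.com/checkout')
```

Once the cert is in place on Tuesday, you simply drop the rule and https serves normally again.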
Hope that helps.
Mike
I've heard really good things about Long Tail Pro, but haven't had the chance to really dive into it besides just playing around.
Mike