Questions created by jcgoodrich
Proper sitemap update frequency
I have 12 sitemaps submitted to Google. After about a week, Google is about 50% of the way through crawling each one. In the past week I've created many more pages. Should I wait until Google is 100% complete with my original sitemaps, or can I just go ahead and refresh them? When I refresh them, the original files will contain different URLs.
Intermediate & Advanced SEO | jcgoodrich
Proper naming convention when updating sitemaps
I have a semi-big site (500K pages) with lots of new pages being created. I also have a process that updates my sitemap with all of these pages automatically. I have 11 sitemap files and a sitemap index file. When I update my sitemaps and submit them to Google, should I keep the same names?
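For what it's worth, the common pattern here is to keep the sitemap filenames stable and just regenerate their contents, letting the `<lastmod>` values in the sitemap index tell crawlers which files changed. A minimal sketch of rebuilding the index that way (the domain and filenames are made-up placeholders):

```python
def build_sitemap_index(base_url, sitemap_names, lastmod):
    """Build a sitemap index that reuses the same child sitemap filenames
    on every rebuild, updating only <lastmod> to signal the change."""
    entries = "\n".join(
        "  <sitemap>\n"
        f"    <loc>{base_url}/{name}</loc>\n"
        f"    <lastmod>{lastmod}</lastmod>\n"
        "  </sitemap>"
        for name in sitemap_names
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</sitemapindex>"
    )

# Hypothetical stable filenames for the 11 sitemap files
names = [f"sitemap-{i}.xml.gz" for i in range(1, 12)]
xml = build_sitemap_index("https://www.example.com", names, "2014-05-01")
print(xml.count("<sitemap>"))  # 11 child sitemaps, same names on every rebuild
```

Because the names never change, the submitted sitemap index URL in Webmaster Tools stays valid and nothing needs to be resubmitted.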
Intermediate & Advanced SEO | jcgoodrich
What to do about old urls that don't logically 301 redirect to current site?
Mozzers, I have changed my site's URL structure several times. As a result, I now have a lot of old URLs that don't logically redirect to anything on the current site. I started out 404-ing them, but it seemed like Google was penalizing my crawl rate AND it wasn't removing them from the index after crawling them several times. There are way too many (>100k) to use the URL removal tool, even at a directory level. So instead I took some advice and changed them to return 200, but with a "noindex" meta tag and no rendered content. I get fewer errors, but I now have a lot of pages that do this. Should I (a) just 404 them and wait for Google to remove them, (b) keep the 200 + noindex, or (c) is there something else I can do? 410 maybe? Thanks!
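To make the 410 option concrete: instead of serving 200 + noindex, the server can match retired URL patterns and answer 410 Gone, which is a stronger signal than 404 that the removal is deliberate. A minimal sketch (the URL patterns are invented; a real list would come from the site's own routing history):

```python
import re

# Hypothetical patterns covering retired URL structures
DEAD_URL_PATTERNS = [
    re.compile(r"^/old-structure/"),
    re.compile(r"^/v2/items/\d+$"),
]

def status_for(path: str) -> int:
    """Return 410 Gone for retired URLs, 200 for everything else.

    410 tells crawlers the page is intentionally and permanently gone,
    rather than possibly missing by accident the way a 404 can read.
    """
    if any(p.match(path) for p in DEAD_URL_PATTERNS):
        return 410
    return 200

print(status_for("/old-structure/widget-5"))  # 410
print(status_for("/widgets/red"))             # 200
```

In practice this check would sit in the web server config or framework routing layer rather than a standalone function, but the decision logic is the same.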
Intermediate & Advanced SEO | jcgoodrich
Best way to handle page filters and sorts
Hello Mozzers, I have a question about the best way to handle filters and sorts with Googlebot. I have a page that returns a list of widgets. I have a "root" page about widgets, plus filter and sort functionality that shows basically the same content but adds parameters to the URL. For example, if you filter the page of 10 widgets by color, the page returns the 3 red widgets on top and the 7 non-red widgets below. If you sort by size, the page shows the same 10 widgets sorted by size. We use traditional PHP URL parameters to pass filters and sorts, so Google obviously views each variation as a separate URL. Right now we don't do anything special for Google, but I have noticed in the SERPs that if I search for "Widgets", my "Widgets" and "Widgets - Blue" pages sometimes both rank close to each other, which tells me Google (rightly) thinks these are all just pages about Widgets. Ideally, though, I'd want only my "Widgets" root page to rank. What is the best way to structure this setup for Googlebot? I think it's one or more of the following, but I'd love any advice:

1. Put a rel=canonical tag on all of the pages with parameters, pointing to the "root"
2. Use the Google URL parameter tool and have it not crawl any URLs with my parameters
3. Put a meta robots "noindex" on the parameter pages

Thanks!
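For the rel=canonical option, the usual implementation is to strip the facet parameters from the current URL and emit the result in a `<link rel="canonical" href="...">` tag on every filtered/sorted variant. A small sketch (the parameter names `color`, `size`, and `sort` are placeholders for whatever the site actually uses):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical filter/sort parameters that should not create indexable variants
FACET_PARAMS = {"color", "size", "sort"}

def canonical_url(url: str) -> str:
    """Strip filter/sort parameters so every filtered or sorted variant
    declares the "root" listing page as its canonical URL."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in FACET_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

print(canonical_url("https://example.com/widgets?color=red&sort=size"))
# -> https://example.com/widgets
```

Non-facet parameters (pagination, for instance) pass through untouched, so the same helper can run on every listing page.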
Intermediate & Advanced SEO | jcgoodrich
Where is the best place to put a sitemap for a site with local content?
I have a simple site that has cities as subdirectories (so the URL is root/cityname). All of my content is localized for the city. My "root" page simply links to the other cities. I very specifically want to rank for "topic" pages for each city, and I'm trying to figure out where to put the sitemap so Google crawls everything most efficiently. I'm debating the following options; which one is better?

1. Put the sitemap in the footer of "root" and link to all popular pages across cities. The advantage here is obviously that the links are one less click away from root.
2. Put the sitemap in the footer of "city root" (e.g. root/cityname) and include all topics for that city. This is how Yelp does it. The advantage here is that the content is "localized," but the disadvantage is that it's further from the root.
3. Put the sitemap in the footer of "city root" and include all topics across all cities. That way, wherever Google enters the site, it will be close to all the topics I want to rank for.

Thoughts? Thanks!
Intermediate & Advanced SEO | jcgoodrich
How to structure links on a "Card" for maximum crawler-friendliness
My question is how to best structure the links on a "Card" while maintaining usability for touchscreens. I've attached a simple wireframe, but the "card" is a format you see a lot on the web now: it's about a "topic" and contains an image for the topic plus some text. When you click the card, it links to a page about the "topic". My question is how to best structure the card's HTML so Google can most easily read it. I have two options:

a) Make the elements of the card two separate links, one for the image and one for the text. Google would read this as:

<a href="/target-url"><img src="topic.png" alt="Topic"></a> <!-- image -->
<a href="/target-url">Topic</a> <!-- text -->

b) Make the entire "card" a single link, which would cause Google to read it as:

<a href="/target-url">[a bunch of div elements that include the anchor text and image alt attribute above, along with a fair amount of additional text]</a>

Holding UX aside, which of these options is better purely from a Google crawling perspective? Does (b) confuse the bot about what the target page is about? If one is clearly better, is it a dramatic difference? Thanks!
Intermediate & Advanced SEO | jcgoodrich
Going after multiple similar keywords, which is the better approach?
Let's say I have a page targeting a keyword, "New York Restaurants". There are also several "very close" variations of this keyword which I could also target. Here are the volume estimates:

New York Restaurants - 100
Restaurants New York - 40
Best Restaurants New York - 30
Best Restaurants in New York - 20
etc.

Given this, which of the following is the better overall approach?

A) Have one page and work in all of these keywords so the page targets all of them. For example, here I'd try to weave in "Best" in different ways.

B) Have multiple pages and use 301 redirects. Create one page targeted only at "New York Restaurants" and then create additional pages with the other terms in the URL and headline, which 301 redirect to my "New York Restaurants" page. This is similar to how Wikipedia does redirects; for example, "Bourne 2" 301 redirects to "Bourne Supremacy". Thanks!

| Keyword | Volume | Competition | CPC |
| New York Restaurants | 12,100 | Medium | $0.93 |
| Restaurants New York | 2,900 | Medium | $1.00 |
| Best Restaurants in New York | 3,600 | Low | $0.69 |
| Best New York Restaurants | 2,400 | Low | $0.80 |
| New York's Best Restaurants | 260 | Low | $0.76 |

Intermediate & Advanced SEO | jcgoodrich
Does hiding responsive design elements on smaller media types impact Google's mobile crawler?
I have a responsive site and we hide elements on smaller media types. For example, we have an extensive sitemap in the footer on desktop, but when you shrink the viewport to mobile we don't show the footer. Does this practice make Google's mobile bot crawler much less efficient and therefore impact our mobile search rankings?
Intermediate & Advanced SEO | jcgoodrich
303 redirect for geographically targeted content
Any insight as to why Yelp does a 303 redirect, when everyone else seems to be using a 302? Does a 303 pass PR? Is a 303 preferred?
Technical SEO | jcgoodrich
What is the best way to execute a geo redirect?
Based on what I've read, it seems like everyone agrees an IP-based, server-side redirect is fine for SEO if you have content that is "geo" in nature. What I don't understand is how to actually do this. After a bit of research, there seem to be three options:

1. Do a 301, which most sites seem to do, but that basically means if Google crawls you from different US areas (which it may or may not), it essentially thinks you have multiple homepages. Does Google only crawl from SF-based IPs?
2. A 302 passes no juice, so I probably don't want to do that.
3. Yelp does a 303 redirect, which it seems like nobody else does, but Yelp is obviously very SEO-savvy. Is this perhaps a better way that solves the above issues?

Thoughts on the best approach here?
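To make the 303 variant concrete, here is a minimal sketch of the server-side decision: geolocate the visitor's IP to a country, then answer 303 ("See Other"), which points the client somewhere else without asserting that the homepage has permanently moved the way a 301 would. The country-to-city mapping and URLs are invented for illustration:

```python
def geo_redirect(country_code: str) -> tuple[int, str]:
    """Choose a redirect for the bare homepage based on the visitor's
    geolocated country code. Returns (HTTP status, target path)."""
    # Hypothetical mapping from country to a localized landing page
    CITY_BY_COUNTRY = {"US": "/new-york", "GB": "/london"}
    target = CITY_BY_COUNTRY.get(country_code, "/choose-city")
    return 303, target

print(geo_redirect("US"))  # (303, '/new-york')
print(geo_redirect("FR"))  # (303, '/choose-city')
```

The IP-to-country lookup itself would come from a geolocation database in front of this function; the sketch only shows the status-code choice being asked about.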
On-Page Optimization | jcgoodrich