I think all the implementations work just about the same. We chose to do it in our sitemaps because that was the easiest for our developer to implement. You should choose one or the other; there's no need to do multiple implementations.
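For reference, a quick sketch of what the sitemap version looks like: each `<url>` entry lists every language alternate (including itself) via `xhtml:link`. The URLs below are made-up examples, and this helper is just illustrative, not how we actually generate ours.

```python
# Illustrative hreflang-in-sitemap sketch. ALTERNATES maps language codes
# to example URLs (both are hypothetical).
ALTERNATES = {
    "en": "https://example.com/en/page",
    "fr": "https://example.com/fr/page",
}

def sitemap_entry(canonical_lang: str) -> str:
    # One <url> block: the <loc>, then an xhtml:link alternate for EVERY
    # language version, including the page itself.
    lines = ["<url>", f"  <loc>{ALTERNATES[canonical_lang]}</loc>"]
    for lang, href in ALTERNATES.items():
        lines.append(
            f'  <xhtml:link rel="alternate" hreflang="{lang}" href="{href}"/>'
        )
    lines.append("</url>")
    return "\n".join(lines)

print(sitemap_entry("en"))
```

Whichever page you treat as canonical, every alternate has to list all the others, or Google may ignore the annotations.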
Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
john4math
@john4math
Job Title: Technical Marketing Manager
Company: IXL Learning
Website Description
A create-your-own site for teachers with quizzes and a multitude of activities.
I'm sarcastic & easily sunburned. I enjoy good beer & trail running.
Favorite Thing about SEO
Google Adwords
Latest posts made by john4math
-
RE: Adding hreflang tags - better on each page, or the site map?
-
RE: Are .clinic domains effective?
(This is all speculation, as I've never done this before. There are probably people in the forum who have.)
Be aware you're switching from a ccTLD to a gTLD. Is the clinic primarily for Canadian residents? In the simplest terms, switching from a .ca to a .clinic may hurt your Canadian rankings, and help your rankings everywhere else. If you want your site to continue targeting Canadians specifically, you can set that in your Google Webmaster Tools, although I think having the .ca domain itself is a stronger indicator to Google that your site is geared towards Canadians.
-
RE: Google indexing despite robots.txt block
It sounds like Martijn solved your problem, but I still wanted to add that robots.txt exclusions keep search bots from reading disallowed pages, but they do not stop those pages from being returned in search results. When those pages do appear, a lot of times they'll have a page description along the lines of "A description of this page is not available due to this site's robots.txt".
If you want to ensure that pages are kept out of search engine results, you have to use the noindex meta tag on each page.
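To make the distinction concrete, here's a small sketch using Python's standard library. The robots.txt rules and HTML are made-up examples; the point is that a disallow only blocks crawling, while the meta tag is what actually keeps a page out of the index (and the page must *not* be blocked in robots.txt, or the bot never sees the tag).

```python
from urllib.robotparser import RobotFileParser
from html.parser import HTMLParser

# robots.txt only blocks crawling; a disallowed URL can still be indexed
# if other sites link to it.
robots = RobotFileParser()
robots.parse("User-agent: *\nDisallow: /private/".splitlines())
print(robots.can_fetch("Googlebot", "https://example.com/private/page"))  # False

# To keep a page out of results entirely, serve a noindex meta tag on the
# page itself.
class NoindexDetector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name") == "robots" and "noindex" in a.get("content", ""):
            self.noindex = True

detector = NoindexDetector()
detector.feed('<head><meta name="robots" content="noindex, follow"></head>')
print(detector.noindex)  # True
```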
-
RE: Geoip redirection, 301 or 302?
For geo-redirects, I do not recommend 301 redirects. Browsers can cache these: if you tell a browser in Canada that example.com should redirect to www.example.com/ca-fr, and the user later changes their language to English and tries to go to www.example.com, the browser could reuse that cached redirect and go back to the French version without ever hitting your server. A 301 tells the browser that www.example.com ALWAYS (permanently) goes to www.example.com/ca-fr. Page rank isn't really a consideration with these, since Googlebot always comes from the US, so it should never hit these redirects. If example.com always goes to one of the versions via a redirect (i.e. you don't serve content under that root URL), then you do have a bit of a problem: you don't want to 302 Googlebot to another page for your home page, but at the same time, you want to avoid weird redirect behavior for your customers.
Google can visit the international versions directly without redirects, right? They should have no problem indexing those pages then.
I agree with István, get some local links to your different local versions, register them each with Google Webmaster Tools (and Bing), put up sitemaps for each, and implement the hreflang tags in your sitemaps (or pages). That way Google can easily index each version, and knows exactly what each version is for.
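A rough sketch of the status-code logic I'm describing, with made-up locale paths and no real framework, just to show the 302-for-geo vs. 301-for-moves split:

```python
# Hypothetical redirect dispatcher. LOCALE_HOMES and the paths are
# illustrative, not a real site's routing table.
LOCALE_HOMES = {"CA": "/ca-fr/", "US": "/us-en/"}

def respond(path: str, country: str):
    if path == "/":
        # 302 (temporary): the right target depends on the visitor, so
        # the browser must ask the server again next time instead of
        # caching a "permanent" answer.
        return 302, LOCALE_HOMES.get(country, "/us-en/")
    if path == "/old-page":
        # 301 (permanent): this URL has genuinely moved for everyone.
        return 301, "/new-page"
    return 200, path

print(respond("/", "CA"))  # (302, '/ca-fr/')
```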
-
RE: Question on noscript tags and indexing
Weird. We were having a problem where lots of our skill pages were getting our <noscript> text used as page descriptions on Google SERPs. We added these comments, and Googlebot reverted to using our meta description as the page description in SERPs. It could have been a freak coincidence that Google stopped using our <noscript> text right after we implemented the tags, or possibly Google was (perhaps accidentally) supporting them for web search back when we originally did this, and has since stopped. Anyways, our SERPs remain clean of our <noscript> text today (example: https://www.google.com/search?q=site:www.ixl.com/math/grade-5).
John Mueller recently commented on that Quora thread saying it won't do anything for web search, so IMO that puts this to rest.
-
RE: Is it possible to set a Goal conversion tracking from a subdomain to a root domain?
You can set up the goals in the subdomains' profiles if you want to view goals there. That's entirely up to you; each profile is its own thing. When I say "filter", I mean write "example.com" into the search box there and search by it. You can also click "advanced" next to the search box and make the search more granular if need be.
-
RE: Question on noscript tags and indexing
Can you try wrapping only the message about Javascript with the googleoff/googleon comments, and see what happens? You don't have to put it around everything in the <noscript>. I would agree that it sounds like the structure of your site is not ideal, but I'd try that first and see if it solves the problem.
-
RE: Question on noscript tags and indexing
I had a similar problem: Google was picking up <noscript> text and using it as the description for our pages in some SERPs. We didn't want to remove them, so we tried using "googleoff" and "googleon" tags, which are just HTML comments that Googlebot can read. You can read their documentation here: https://developers.google.com/search-appliance/documentation/68/admin_crawl/Preparing#pagepart. We wrapped the text in the <noscript> with these comments, and it worked like a charm, so it does look like Google respects these tags.
If I were you, I'd go ahead and add the syntax if it's easy for you to do (i.e. you only have to add it in a few places in the code, not in thousands). It's probably not great for your SEO that Google thinks your site is about Javascript. Or you can do what Frederico says and remove it. Only you know your user base, but he's probably right: almost everyone has Javascript enabled these days.
I originally read about this in the Quora thread here: http://www.quora.com/Quora/Why-hasnt-Google-banned-Quora-for-hiding-answers-from-search-engine-visitors. Quora uses it to control what text Googlebot can index on their pages. If you want to see an example of it on my site, you can view one of our skills here: http://www.ixl.com/math/pre-k/identify-circles-squares-and-triangles.
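The markup we ended up with can be sketched with a tiny helper. The function name is mine, and per the update elsewhere in this thread, Google web search may no longer honor these comments; they're plain HTML comments either way, so they're harmless to non-Google crawlers:

```python
# Hypothetical helper that wraps a <noscript> fallback message in the
# googleoff/googleon comments described above, so the crawler that
# supports them skips the text when indexing.
def wrap_noscript_fallback(message: str) -> str:
    return (
        "<noscript>"
        "<!--googleoff: index-->"
        f"{message}"
        "<!--googleon: index-->"
        "</noscript>"
    )

html = wrap_noscript_fallback("This site requires JavaScript.")
print(html)
```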
-
RE: World Localities in AdWords?
You can actually look this up in Google Display Planner. Log into your Adwords account and go to Tools and Analysis > Display Planner. Select the topic World Localities > San Antonio under the Individual targeting ideas tab. "Texas > San Antonio" should show up in the box in the upper right. Now select the Placements subtab, then Sites. This will give you a list of sites Google thinks are relevant to this topic. The top ten are sanantonio.com, kens5.com, beaumontenterprise.com, thesanantonioriverwalk.com, tubetexas.com, mywesttexas.com, riverwalkguide.com, hillcountrycurrent.com, zacktravel.com, and corsicanadailysun.com.
Of all Google's targeting options, I've had the least success with Topics. I'd recommend picking sites from the list that appears and targeting them directly via Placements, or pages within those sites if they're not all about San Antonio. Topic targeting is applied page by page, so if you use it, your ads won't appear across all the domains above, only on pages Google deems relevant to this topic. I just think it doesn't work that well yet. I've had better luck combining it with other targeting options, like interest categories. For example, you could pick an interest category for Travel and combine it with this topic to hit travelers reading about San Antonio. Pretty cool, right?
Also, be aware that the default location settings for Adwords will show ads to people around the world looking for San Antonio, as opposed to people specifically located in San Antonio. If that's important to you, make sure to expand the Advanced location settings in the campaign settings, and correct that.
-
RE: Multiple Remarketing Tag on a single web page?
I've been migrating to using Google Analytics for our remarketing lists (see here) to keep all these Adwords pixels from hanging around on my site. Unless you're using these for doing remarketing on the search network... the Google Analytics segments aren't supported for that yet, but I was told by my reps to expect that soon.
As Remus said, there's absolutely no problem with having multiple Adwords remarketing pixels on one page.
Best posts made by john4math
-
RE: Blog for SEO: embedded in the site or separate
Add it to your site. You want people to know the blog is part of the site, and you want people to be able to get from your site to the blog and vice versa easily. Also, you want your site's rankings to benefit from the traffic you bring in via the blog, and vice versa.
To make it be treated as part of your site, you should set it up under a URL like mysite.com/blog, vs. blog.mysite.com. The subdomain approach will get your blog treated like a new site.
-
RE: Is 404'ing a page enough to remove it from Google's index?
Setting pages to 404 should be enough to remove them after Google crawls them enough times. Google has to be careful about this, because when many sites crash or have site maintenance, they return 404 instead of 503, so Google doesn't want to remove pages from its index until it's sure the page is gone.
Google talks about removing pages from its index here. The Google Webmaster Tools URL removal tool is only intended for pages that urgently need to be removed, so I wouldn't recommend that. Google recommends:
- If the page no longer exists, make sure that the server returns a 404 (Not Found) or 410 (Gone) HTTP status code. This will tell Google that the page is gone and that it should no longer appear in search results.
- If the page still exists but you don't want it to appear in search results, use robots.txt to prevent Google from crawling it. Note that in general, even if a URL is disallowed by robots.txt we may still index the page if we find its URL on another site. However, Google won't index the page if it's blocked in robots.txt and there's an active removal request for the page.
- Alternatively, you can use a noindex meta tag. When we see this tag on a page, Google will completely drop the page from our search results, even if other pages link to it. This is a good solution if you don't have direct access to the site server. (You will need to be able to edit the HTML source of the page).
Is there a reason you are 404'ing these pages rather than redirecting them? If these pages have new pages with similar content, you should do a 301 redirect to keep the link juice flowing and to take advantage of these pages being linked to. If you do continue returning 404 for these pages (or even if you don't...), make sure your 404 page is a useful one that helps users find the page they're looking for (Google help article).
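To make the 404-vs-410 distinction concrete, here's a minimal sketch (the paths are hypothetical): 410 says the removal is deliberate, while 404 just says the URL wasn't found.

```python
# Hypothetical set of pages that were deliberately taken down.
REMOVED = {"/discontinued-product"}

def status_for(path: str) -> int:
    if path in REMOVED:
        return 410  # Gone: the removal is intentional and permanent
    return 404      # Not Found: the URL simply doesn't exist
```

A crawler treating the two differently could drop 410 URLs from its index sooner, since there's no ambiguity about whether the page might come back.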
Also, Ryan, I'd be interested in hearing the results of using the 410 status code. I would imagine that status code would do the trick! I'm surprised I haven't read about this more, or why it's not mentioned in the help file linked to above.
-
RE: Should me URLs be uppercase or lowercase
I like lowercase because when I type URLs by hand, I don't think to capitalize things. If you capitalize things, you have to get the casing right to make the URL valid (unless you're setting up all sorts of fancy redirects); otherwise you get a 404 and are left scratching your head. Also, I agree with Dan that it looks better.
Hyphens vs. underscores is a classic question; Matt Cutts says to go with hyphens: http://www.youtube.com/watch?v=Q3SFVfDIS5k. I like that better too.
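The "fancy redirects" I mentioned amount to something like this sketch: 301 any mixed-case path to its lowercase form so only one casing is ever canonical. The URL is a made-up example.

```python
from urllib.parse import urlsplit, urlunsplit

def normalize_case(url: str):
    # Lowercase only the path; scheme and host are case-insensitive
    # anyway, and query values may be case-sensitive.
    parts = urlsplit(url)
    canonical = urlunsplit(parts._replace(path=parts.path.lower()))
    if canonical != url:
        return 301, canonical  # permanent redirect to the lowercase URL
    return 200, url

print(normalize_case("https://example.com/Math/Grade-5"))
# (301, 'https://example.com/math/grade-5')
```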
-
RE: How many jumps between 301 redirects is acceptable?
301 redirect chains are bad. If it's important to you to get the link juice from page A to page D, you should change it to just one redirect from A --> D. Matt Cutts talked to Rand about it in a Whiteboard Friday (http://www.seomoz.org/blog/whiteboard-interview-googles-matt-cutts-on-redirects-trust-more):
Is It a Bad Idea to Chain Redirects (e.g. 301-->301-->301)?
"It is, yeah."
Matt was very clear that Google can and usually will deal with one or two redirects in a series, but three is pushing it and anything beyond that probably won't be followed. He also reiterated that 302s should only be used for temporary redirects...but you already knew that, right?
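Flattening a chain is mechanical: follow each old URL to the end of its chain and map it straight to the final destination, so every old URL answers with a single 301. A sketch (the paths are made up):

```python
def collapse(redirects: dict) -> dict:
    """Turn chains like A -> B -> C -> D into direct A -> D mappings."""
    flat = {}
    for src in redirects:
        seen, dst = {src}, redirects[src]
        # Follow the chain to its end; `seen` guards against loops.
        while dst in redirects and dst not in seen:
            seen.add(dst)
            dst = redirects[dst]
        flat[src] = dst
    return flat

chain = {"/a": "/b", "/b": "/c", "/c": "/d"}
print(collapse(chain))  # {'/a': '/d', '/b': '/d', '/c': '/d'}
```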
-
RE: How does a canonical work and is it necessary to also have a no index, follow tag in place?
Ryan, spot on as always.
One other thing: it sounds like some of the canonicals you're placing on pages would be better suited to 301 redirects, like enforcing whether or not a URL has a trailing slash. If you can avoid using canonicals and use 301 redirects instead, that's the preferred method for resolving duplicate content issues. Canonicals are more for when there are parameters on the URLs and you can't get away from serving the pages with those parameters.
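For the parameter case, the canonical tag just points the parameterized URL back at its clean version. A sketch (URL and parameter are made-up examples):

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_tag(url: str) -> str:
    # Strip query string and fragment to get the clean version of the
    # URL, then emit the rel=canonical link tag for the <head>.
    parts = urlsplit(url)
    clean = urlunsplit(parts._replace(query="", fragment=""))
    return f'<link rel="canonical" href="{clean}">'

print(canonical_tag("https://example.com/shoes?utm_source=news"))
# <link rel="canonical" href="https://example.com/shoes">
```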
-
RE: Non US site pages indexed in US Google search
Here are all the things you can do to try to geotarget your content for the search bots:
- Register each subfolder as a separate site in Google Webmaster Tools (e.g. example.com/ca/, example.com/us/), and geotarget it (see here).
- Set meta tags or http headers on each page to let Bing know the language and country (see here).
- For duplicate or near-duplicate pages across different English-speaking localities, you can try out the hreflang tags to clue Google in that they're the same page, but geotargeting users in different locations. I haven't personally implemented this myself, so I can't speak to how well it works, but you can find more info about it here and here.
Setting nofollows just stops PageRank from flowing, but bots can still follow these links, so I wouldn't do that.
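For the second bullet, the per-page signals can be sketched like this. The `content-language` meta tag is the one Bing has historically documented, but treat the exact names here as an assumption rather than a spec:

```python
def geo_headers(lang: str, country: str):
    # Build both forms of the signal: an HTML meta tag for the page
    # <head>, and the equivalent HTTP response header.
    tag = f"{lang}-{country}"
    meta = f'<meta http-equiv="content-language" content="{tag}">'
    header = ("Content-Language", tag)
    return meta, header

print(geo_headers("en", "ca"))
```

Either form works; you only need one of the two per page.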
-
RE: What audience size do you need for a successful retargeting campaign?
Sure, 10,000 people is enough to run a retargeting campaign.
You might try bringing up the bid to get some more impressions and see how it goes. 100 impressions isn't enough to tell much of anything. Retargeting does tend to have the highest ROI of any display advertising if you target it properly. Since the users in your retargeting list are more likely to convert on your site since they've been there before, you can tolerate a higher CPC with retargeting as compared to other display campaigns.
Are you segmenting your retargeting lists into many smaller lists? You can split your lists into users that have made it further down your conversion funnel, and bid higher the further they've made it down the funnel already (e.g. visited a subscription page and left, or added items to a shopping cart and abandoned it). If you're using Adwords and Google Analytics, you can do this easily with Google Analytics Remarketing, and retarget any visitors on your site based on things like time on site, pages they've visited, and custom events and goals they've accomplished (or not accomplished).
-
RE: <span> tags inside <a> tags - is this bad?
<span>s are all over the web and are used in lots of different situations. They shouldn't adversely affect your rankings.
That being said, going over your site and adding <span>s into all your <a>s doesn't sound like fun... and after all, you may want to change it again down the road. Can't you accomplish something similar with CSS? I think styling your <a>s with "display:block;" should accomplish the same thing as adding this to all your <a>s.
-
RE: High CTR, high CPC?
Go to your Keywords tab, and select that single keyword, Go to the Keyword details tab, and select "Auction Insights". This will give you a bunch of useful information:
- Impression share: how often your ad shows up
- Avg. position: the average position of your ad when it does show up
- Overlap rate: how often your ad appears in the same auctions as another advertiser's
- Position above rate: how often you rank above another advertiser when you both appear
- Top of page rate: how often you're at the top of the page (vs. the side)
So if your average position is already between 1-2, and your impression share is high, raising the bid probably won't do much. However, if you're not at the top of the page much, or your impression share is lower, you might be able to get more clicks by raising the bid.
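That rule of thumb can be restated as a toy decision function. The thresholds here are my own guesses, not anything Adwords publishes:

```python
def should_raise_bid(avg_position: float, impression_share: float,
                     top_of_page_rate: float) -> bool:
    # If you're already in position 1-2 with high impression share,
    # a higher bid buys you almost nothing.
    if avg_position <= 2 and impression_share >= 0.9:
        return False
    # Otherwise there's headroom: missing impressions or missing the
    # top-of-page slots are both things a higher bid can fix.
    return impression_share < 0.9 or top_of_page_rate < 0.5
```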