Posts made by john4math
-
RE: Best practices for robotx.txt -- allow one page but not the others?
Yeah, but Ryan's answer is the best one if you can go that route.
-
RE: Best practices for robotx.txt -- allow one page but not the others?
What you outlined sounds to me like it should work. Disallowing /searchhere? shouldn't disallow the top-level search page at /searchhere, but should disallow all the search result pages with queries after the ?.
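In other words, something along these lines should do it (a minimal sketch, assuming /searchhere is the actual path of your search page):
User-agent: *
Disallow: /searchhere?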
-
RE: Trouble Exporting Xenu Crawl to Excel
I've never used Xenu before, but if you can save the export as a txt file, you can import that into Excel. Here are some sample directions I found.
-
RE: Best method to measure conversions (Adwords)
If you're tracking by using different phone numbers, you could set a cookie on these landing pages recording which phone number to display, and then have the rest of your pages show that phone number instead of your regular one. That way users will only ever be exposed to one number, and you'll have tracking for those people who came to your site from a landing page.
All of this wouldn't be too difficult to implement. You could do it with jQuery and a jQuery cookie plugin.
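A rough sketch of the idea (untested; the cookie name, number, and .phone-number class are just examples, and it assumes jQuery plus the jquery-cookie plugin are loaded):
// On a landing page: remember which tracking number this visitor should see.
$.cookie('trackingPhone', '1-800-555-0123', { expires: 30, path: '/' });
// On every other page: swap in the stored number if one exists.
$(function () {
  var phone = $.cookie('trackingPhone');
  if (phone) {
    $('.phone-number').text(phone);
  }
});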
-
RE: Broad Match Modifier: Should I use the + on the first keyword?
So what the plus does is enforce that the word is used in the user's query. For example, if you had:
- +buy +widgets
- buy +widgets
The first would enforce that both "buy" and "widgets" are in the query. This would match things like "buy cheap widgets", "buy widgets now", "i want to buy some widgets", etc.
The second would only enforce that "widgets" is in the query, and would allow Google to do broad matching against the word buy. The second could match the same queries as the first, but also additional queries like "purchase widgets", "get cheap widgets", "buying widgets", etc. Without the plus, Google can swap out the word "buy" for other similar words.
-
RE: What is the best approach to specifying a page's language?
I don't think so; it's still working for me. Here it is in cleartext: http://www.bing.com/community/site_blogs/b/webmaster/archive/2011/03/01/how-to-tell-bing-your-website-s-country-and-language.aspx
-
RE: GA and Ajax Forms
You can track events in Google Analytics by creating custom events as goodlegaladvice mentioned. Google's help file relating to this is here. You can also use these events as goals; see here.
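As a rough sketch (the form selector, endpoint, and category/action names are made up; this assumes the standard asynchronous ga.js snippet and jQuery are already on the page):
$('#lead-form').on('submit', function (e) {
  e.preventDefault();
  $.post('/submit-lead', $(this).serialize(), function () {
    // Record the successful Ajax submission as a GA event
    _gaq.push(['_trackEvent', 'Lead Form', 'Submit', 'Ajax']);
  });
});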
-
RE: CSS Issue or not?
CSS positions things on the page, so if you remove it, it's not surprising that lots of elements overlap. The page isn't going to look good. This is nothing to worry about.
If you want to see how a page looks to search bots, view the source of the page, don't disable the CSS.
-
RE: How do you assess PPC ROI?
To be fair, in your example the Adwords click did (on average) have some effect on the ultimate sale. If the customer never came through the door looking for red widgets, they may never have purchased the blue widgets from you. There are a few different reports in Analytics that can help you see this:
- Conversions > Multi-channel funnels > Attribution Modeling Tool (I think this is available to everyone now). If last-click interaction is by far the most important for you, you can select this in the tool and select the primary dimension of Source/Medium. Drill into Google / cpc and you should see the data you're looking for there.
- Conversions > Multi-channel funnels > Top conversion paths. Select the primary dimension Source/Medium Path. This will show you the different parts that Adwords played in each conversion. You can also pick a secondary dimension of Adwords Campaign Path to see more info.
-
RE: How do you assess PPC ROI?
I built my own Google Analytics custom reports which I export and add a few equations to in order to track ROI. I do it for a subscription site so it's not quite as complicated as it would be for an eCommerce site, where each purchase can have different items, margins, cost of goods sold, etc. What are you looking to see beyond paid search revenue to spend?
GA may be making this easier in 2013 with their "Universal Analytics" offering: http://analytics.blogspot.com/2012/10/google-analytics-summit-2013-whats-new.html.
-
RE: Strategy After Switching To HTTPS
Is it really necessary to have everything on https? What kind of information are you collecting on this lead form? Most people won't bat an eye about you collecting things like name and e-mail address on a non-secure connection, but it would depend on your customers and industry. For the visits you're getting now, are a higher percentage submitting the lead form because the pages are secure? If not, I'd flip the URLs back. https is slower and the browser won't be able to cache everything, so it makes the user suffer on every page load.
A few other thoughts (it's impossible to say without seeing your site): having a lead form on every page may be overkill. You could just link to the form (and then have the form itself be secure), or you might find success collecting just the e-mail address on all of your pages and, once you have it, sending the visitor on to a secure lead form. Then you'll have a chance to e-mail market to those who abandon the form. Something to think about and maybe A/B test.
-
RE: Why does SEOMoz think I have duplicate content?
These are duplicate pages. I'm not seeing a redirect from http://www.federalnational.com/About/tabid/82/Default.aspx to http://www.federalnational.com/about.aspx, nor is there a canonical tag telling the bot that http://www.federalnational.com/about.aspx is the canonical version.
Ideally, the platform could 301 redirect the first URL to the second. Failing that, you could place a canonical tag in the header of the page to point to the /about.aspx page.
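For example, the canonical tag in the <head> of the /About/tabid/82/Default.aspx version would look something like this:
<link rel="canonical" href="http://www.federalnational.com/about.aspx" />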
Not relevant to the question, but I also noticed that in the automatically created version of the page, the active class isn't getting set properly on the <a> for "About", so it's not getting highlighted in the header.
-
RE: Same Page, Multiple URLS- Canonical tag? Redirect?
You indeed have a duplicate content problem. The PageRank for the www.telikin.com/index.php page is not contributing towards the PageRank for your www.telikin.com page. You should 301 redirect www.telikin.com/index and www.telikin.com/index.php to www.telikin.com. Personally, I don't know how to write the actual redirect off the top of my head, but when you do it, make sure your home page still works, as you can potentially create an infinite redirect loop when messing with your site's index page.
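For reference, the usual approach on Apache is a mod_rewrite rule along these lines in .htaccess (an untested sketch; the condition on THE_REQUEST is there to avoid a redirect loop when the home page is served internally by index.php):
RewriteEngine On
RewriteCond %{THE_REQUEST} ^[A-Z]+\s/index(\.php)?[/?\s] [NC]
RewriteRule ^index(\.php)?/?$ http://www.telikin.com/ [R=301,L]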
The trailing slashes don't matter, meaning Google knows www.telikin.com/index.php/ is the same page as www.telikin.com/index.php, so you don't have to concern yourself with that.
-
RE: Best practices for switching site languages around
I think switching the two versions should be fine.
You can set up rel alternate hreflang tags if the content on your site is duplicated between English and Spanish. This will help the search engines understand that these are alternate versions of the same pages based on language, and help Spanish searchers get the Spanish version of pages and English searchers get the English versions. Google talks about that here. You can do it on the page, or in your sitemap.
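On-page, the tags look something like this (the example.com paths are just placeholders for your English and Spanish URLs):
<link rel="alternate" hreflang="en" href="http://www.example.com/en/some-page/" />
<link rel="alternate" hreflang="es" href="http://www.example.com/es/some-page/" />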
You can use the meta language tags or set the language in your HTTP headers for Bing to understand what's going on (see here).
-
RE: Google Remarketing Tag
It's an account-level tag that works across any webpages on any domains it's included on. I've used remarketing pixels on microsites we run on separate domains, and within e-mails we send, to target those users with ads from our main site.
-
RE: Search Refinement URLs
The URL can be counted twice and this is duplicate content. If Google sees a link to that second URL with the parameters on the end, it won't assume it's the same page as the page with the URL that doesn't have the parameters.
A few things you can do to mitigate this:
- In Google Webmaster Tools and Bing Webmaster Tools, if those parameters never result in unique pages, you can tell each search engine to ignore each one of those parameters. If you ignore all of those parameters within webmaster tools, the second URL example above will resolve to the first URL in the eyes of each search engine.
- You can set a rel canonical on all of your pages to the root page URLs. Then no parameters will ever affect the indexing of your pages.
I'd be a little wary of setting rel canonical on all of my pages (I haven't done this, so I can't report whether it works or not), so I'd personally opt for option 1 over option 2, although in theory either one should solve the problem.
-
RE: Multilingual site (rel="alternate" hreflang="x")
Have you registered each of these country-specific directories of your site within Google Webmaster Tools and set their geographic targets? If not, that's a good place to start. Here are Google's directions for that. Maybe that, along with the rel alternate hreflang tags, will be enough to clean up Google's SERPs. You don't want canonical tags on any of these pages, as you want each of them appearing in SERPs depending on the country of the searcher. The meta language code is more for Bing to understand what's going on (see here). Google makes more of an effort to figure out the language of each page on its own. Here is their guide on multi-regional and multilingual sites.
-
RE: When Google's WMT shows thousands of links from a single domain... Should they be removed?
Google just launched the disavow tool, and unless you've received one of those dreaded Google warnings, I'd be very wary of using it. Lots of sites have duplicate content issues or site-wide links, so it's not all that uncommon for a site to have 2,000 links to another site.
Personally, I wouldn't worry about it.
-
RE: Canonicalization of index.html - please help
Yes, this does create a duplicate content issue. The best solution is to have /index.html 301 redirect to /. However, the canonical tag as you outlined above should also fix the issue if you don't have access to your server configuration for redirects.
-
RE: When Google's WMT shows thousands of links from a single domain... Should they be removed?
You can ask the webmaster of that site to correct the links so they go to valid pages on your site. Otherwise, I wouldn't bother with them. Just make sure you have a nice 404 page on your site so people who follow the links can find their way to your content without too much difficulty.
-
RE: Redirecting a Page from Domain A to Domain B
Do any pages on Domain A link to this page? Is it going to be off-putting or strange for users to get whisked off to Domain B when they follow these links? If so, is this a place where a cross-domain canonical might make more sense?
If you go the redirect route, do make sure to update all of your internal links on Domain A to go directly to the page on Domain B, so as not to bleed any link juice through the redirect, and to help mitigate the surprise of changing domains (some people look at link destinations before clicking links).
-
RE: Duplicated Content with joomla multi language website
The Google Webmaster set up sounds right to me!
You should set the rel alternate on all pages that go back and forth, not just the English pages. That way if Google wants to return a Thai page to an English searcher, it'll know to reference the English page. This is the set up Google recommends in their help documentation.
Don't worry about a new sitemap for the /th/ pages. Your current set up should be fine.
-
RE: What is the best approach to specifying a page's language?
The article addresses this. Setting the language in the content-language meta tag will override the lang attribute in the html tag or the title tag. They're recommending using one method to set the language and sticking with it, rather than setting the language in multiple places on a page.
So setting the lang attribute on the <html> tag is fine; just don't also set a content-language meta tag or a lang on your <title> tag that specifies something different.
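For example, pick one of these and use it consistently (the "es" code is just an example):
<html lang="es">
or
<meta http-equiv="content-language" content="es" />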
-
RE: What is the best approach to specifying a page's language?
They talked about this on the Bing Webmaster blog awhile back. You can read about it here. There is a priority order. The article states:
Keep in mind that the priority order for these tags is: <meta>, <html>, <title>. In other words, the document location set in the "content-language" meta tag will always supersede the document location indicated in the <html> or <title> tag. It's best that you use one option, instead of multiple options here.
-
RE: Adding Meta Languange tag to xhtml site - coding help needed
The Bing Webmaster Central article where they discuss how to set the language for your pages is here.
-
RE: Duplicated Content with joomla multi language website
The proper way to handle this is with rel=alternate hreflang tags. This will tell Google the content is the same, but in different languages. See http://support.google.com/webmasters/bin/answer.py?hl=en&answer=189077 for more info. You can place meta tags on each page, or do it in your sitemap.
Other things you can do to help the search engines get it right are to set up a profile in Google Webmaster Tools for each of the directories (or at least for the Thai one) and set the geotargeting. For Bing, they prefer you set the country and language on each page (see here).
If you block the pages with robots.txt or use canonical tags, you're telling Google not to include those pages in SERPs. It sounds like you want the Thai pages to appear in Thai results, and the English pages in English SERPs, so I wouldn't do that.
-
RE: How to Tell if an Image is Indexed
images.google.com used to show up as a referrer, but Google changed that and lumped it in with organic search. So I don't think there's a good way to see it in GA anymore. Hopefully someone can correct me if I'm wrong!
I found one post where someone edited their GA script so it would start reporting images.google.com traffic separately: http://jrom.net/google-images-in-google-analytics. You might consider that if it's important to you.
-
RE: How to Tell if an Image is Indexed
Google images can search by image! Go to images.google.com, and click the little photo icon on the right of the search bar. You can search by image URL, or upload the image directly to Google to search.
-
RE: 404 in google webmaster tool
If you have new pages that are similar to your old pages and the old pages no longer need to exist, you should 301 redirect the old URLs to their new counterparts. Usually this involves putting the redirects in your conf file. I've never used Joomla, but I think the conf file would live outside of it (e.g. in Apache).
-
RE: Files blocked in robot.txt and seo
What you have there is just blocking rootdomain.com/javascript:void(0). Googlebot can execute and index JavaScript; you should not block it without a good reason. I'd let it read the JavaScript and see the submenus.
-
RE: File name same as folder name, ok?
Is the folder itself also a page? Why not just make domain.com/xyz-products/ serve the domain.com/xyz-products.php page? That seems like the most intuitive way for it to work. When people want to get from a product page within the xyz-products directory back to the top-level page, they may just edit the URL and delete everything after the directory name (I do this all the time).
If you want to stick with the structure you listed, it shouldn't have any issues other than what I mentioned above.
-
RE: 404 in google webmaster tool
You should do redirects for these pages if you can. The probable reason they keep showing up and getting crawled is because people linked to those pages before you redid your site, and Google is continuing to come across those links. If you 301 redirect them to their new counterparts, people will be getting to the content they're looking for when they click those links, and also the link juice from those links will pass to the new pages, so the new pages should also rank better.
-
RE: Files blocked in robot.txt and seo
If you don't want pages in those Disallowed directories to be indexed, then you're doing fine. Those pages won't be able to be crawled, so they're unlikely to appear in search results on any search engine.
The last three entries look fishy to me. I'd need to know what types of URLs you're trying to block to fix them. For the last one, if you're looking to block all pdfs on your site, the syntax would be Disallow: /*.pdf.
-
RE: Impact of SSL - switching from http: to https: on organic rankings?
What's your plan for the new pages? Are you going to support both http and https browsing of your site for all pages? If so, Streamline Metrics' answer is spot on.
Or are you switching all of your pages from http to https? Do realize that having all secure pages means the browser won't be able to cache everything from page to page, and makes page sizes slightly larger, so your page load times will suffer on every page view. This is a major reason why most sites are non-secure on as many pages as possible, and are only secure for pages that need it. I'd strongly consider doing that, or supporting both http and https (with canonical tags) if possible.
-
RE: <span> tags inside <a> tags - is this bad?
<span>s are all over the web, and used in lots of different situations. They shouldn't adversely affect your rankings.
That being said, going over your site and adding <span>s into all your <a>s doesn't sound like fun... and after all, you may want to change it again down the road. Can't you accomplish something similar with CSS? I think styling your <a>s with "display:block;" should accomplish the same thing as adding a <span> to every one of your <a>s.
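i.e. something like this in your stylesheet (the nav selector is just an example; target whichever links you're styling):
nav a { display: block; }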
-
RE: A/B Testing. What do you use?
My preference is Optimizely. It does everything I need it to. I've run tests where I've changed multiple elements on multiple pages, and it's super simple to set everything up. We also test for multiple goals on our site with differing revenue, which they support seamlessly. You can try it out for free on their site without creating an account or anything, to see how you like it.
-
RE: Any providers offering a/b testing using JS callbacks?
The A/B testing provider I prefer is Optimizely. You can do this all within their interface. Their support is fantastic so if you have issues they can help you resolve them quickly, even if it's just help on how to set up a particular test.
They also have a 30-day trial and you can test out their editor on the spot without installing anything to see how it works, just by entering your URL on their home page.
-
RE: E-commerce Website, 1 year on, what are the next steps?
Take a look at the following recent posts and Q&A thread about content creation in "boring" niches (not to say cheese is boring). They seem relevant to your question with regards to generating content:
- http://www.seomoz.org/blog/companies-in-boring-niches-creating-great-content
- http://www.seomoz.org/blog/the-guide-to-developing-a-content-strategy-for-boring-industries
- http://www.seomoz.org/ugc/whiteboard-takeover-5-awesome-content-ideas-for-boring-niches
- http://www.seomoz.org/blog/define-and-align-a-manageable-content-and-social-media-marketing-process
- http://www.seomoz.org/q/example-of-a-local-boring-niche-site-in-a-relatively-high-competition-area-using-strictly-white-hat-tactics
-
RE: Wordwatch Software: PPC Adwords campaign managers heard of, tried, or actively using this?
I'd be interested in seeing your reviews as well.
-
RE: Need advice for indexing a multilingual website
First, make sure you set the geographic target in your Google Webmaster Tools if you have a specific geographic target. If not, Google recommends you don't set a geographic target just for language purposes (see here). In that case, Google will figure out the language on its own (see here).
You have a few options for setting the language on the page, which is what Bing is looking for. They have a nice write-up about it here.
Each subdomain can have its own sitemap and robots.txt, so set those up like you would normally.
-
RE: /forum/ or /hookah-forum/
A good rule of thumb is to do what you think is better for your users, and not necessarily for SEO, as search engines are always moving in the direction of optimizing their users' experiences. Since those shorter URLs are easier for users, you've got your answer right there!
Since your domain is hookah.org, people will know that /blog/ and /forum/ are going to be related to hookahs, so there's no need to repeat the word in the URL. If the domain was more general or a brand, or had more than 1 blog or forum, then adding hookah to the URL would make sense.
-
RE: Question on regular expression for filters on GA
The ^ character anchors your expression to the beginning of the request URI, so it only matches URIs that start with /japan-english. If you remove it, all URLs containing /japan-english will get picked up. That sounds like it should work for you.
If you really just want those two instances, you can include both and separate them with a | which acts as "or" in regular expressions.
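For example (with made-up paths), ^/japan-english/$|^/japan-english/contact/$ would match just those two pages and nothing else.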
There's a nice guide for using regular expressions in GA at http://www.ppchero.com/analytics-regular-expression-characters/ if you want to read more about them.
-
RE: Ways to analyze a 1M rows dataset of search queries
Yeah, Access can handle that many rows; it's Microsoft's database program. You can import your data, and then create queries. It has a design view where you can construct queries in a WYSIWYG fashion, or, if you want, you can write your own SQL.
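For example, a query along these lines (the table and column names are made up; adjust them to however you import your report) would total up clicks and cost per search query:
SELECT Query, SUM(Clicks) AS TotalClicks, SUM(Cost) AS TotalCost
FROM SearchQueries
GROUP BY Query
ORDER BY SUM(Cost) DESC;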
-
RE: Ways to analyze a 1M rows dataset of search queries
I had a similar problem going through my search query reports. If you're already familiar with VB you could do this with a Microsoft Access database rather than setting up a MySQL one w/PHP. I've been working on creating an Access database that I can import my data into, and have it spit out all sorts of useful info (for example negative keywords and placements), but it's only in its early stages right now.
If you just want to see it for a few terms and don't mind doing it one at a time, in the past I've filtered data like this in Excel without VB using advanced filters. I found that using advanced filters rather than VB sped up the process quite a bit, I'd imagine because it's a native Excel function. Using 4 filters you can match whole words in the queries. For example, to find queries containing the whole word "blah", you'd set filters for "blah", "* blah", "blah *", and "* blah *". Then you can use the Subtotal command to do calculations over the visible rows.
More about advanced filters: http://office.microsoft.com/en-us/excel-help/filter-by-using-advanced-criteria-HP005200178.aspx
-
RE: Adwords campaign and adgroup ideas
It's not bad necessarily. You want to keep your ad text and landing page relevant to the keywords in each ad group. So if you have a keyword in which you want a very specific ad to show up, that's fine. Otherwise, if you're ok with the same ad text and landing page showing up for a set of keywords, you can put all of those keywords in the same ad group.
Note that you can control max CPC bids on keywords, so if one isn't converting as well, or one is doing really well, you can change the bids at the keyword level.
-
RE: Adwords campaign and adgroup ideas
You should be using location targeting for those specific locations in your campaigns, and leaving the locations out of your keywords. So if someone in Henderson searches for "elementary private school", they'll see your ad.
For the Henderson campaign, the default Adwords setting will then show ads to people in Henderson who search for these keywords, as well as to people who search for "Henderson NV elementary private school", which is probably what you want. If you'd prefer to exclude searchers outside of Henderson, you can do that in your campaign settings as well with the Location > Target option, by selecting "People in my targeted location".
-
RE: What's the best thing to put in the footer on a PPC landing page?
I'd fix the misspelling "dowbloadable" and the extra space and period after it. I'm not sure what else you need in the footer... having the privacy policy is good. You might consider making the "Dial800" there a link so people can easily find out more information about your company. There's a lot of good info on http://www.dial800.com/products/call-tracking that people might not find if they don't realize the header logo is a link and want more information but aren't ready to give you their info.
-
RE: What's the best thing to put in the footer on a PPC landing page?
It's hard to give an answer without seeing the page or knowing what you're selling. However, if you have many options, try A/B or multivariate testing different variations to see which converts the best.
-
RE: A site is not being indexed by Google Yahoo or Bing
The robots.txt (at http://adoptionconnection.org/robots.txt) is disallowing all bots from reading the site. Change it from:
User-agent: *
Disallow: /
to:
User-agent: *
Disallow:
-
RE: SEO Audit - Panda
Ryan Kent is #1 on the users board, and his answers that I've read in the pro Q&A are always right on. He's the director at Vitopian, and it sounds like they've been helping out sites with Panda and Penguin issues (he wrote a great Penguin-related post here).
He'd be the first person I'd look to for advice.