Posts made by AlanBleiweiss
-
RE: Old school SEO tools / software / websites
Wordtracker for keyword volume and Overture PPC for keyword value were my two go-to resources. And WebTrends for the painful process of attempting to figure out what was happening on-site.
-
RE: In the U.S., how can I stop the European version of my site from outranking the U.S. version?
Gianluca
Thanks for jumping in on this one. So if I'm reading your answer correctly, the bottom line here is that there really should be one site per country, regardless of language spoken, correct?
-
RE: In the U.S., how can I stop the European version of my site from outranking the U.S. version?
Yeah inheriting previous work can be a challenge.
Since you are already planning on rolling out content in different languages, you will not only have the opportunity to set the hreflang tags for each, but it will also be important to ensure that all of the content within each section is actually in that section's primary language, for consistency. That too will help address the confusion Google has.
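As a minimal sketch of what those annotations might look like, assuming hypothetical English, German, and French sections (the domain and paths here are placeholders):

```html
<!-- Included on every translated version of a given page -->
<link rel="alternate" hreflang="en" href="https://www.example.com/en/page/" />
<link rel="alternate" hreflang="de" href="https://www.example.com/de/page/" />
<link rel="alternate" hreflang="fr" href="https://www.example.com/fr/page/" />
```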
-
RE: Should I use strong tags or h1/h2 tags for article titles on my homepage
To clarify the concept of having multiple "titles" on a single page (an H1 headline is the in-content "title" for that page): David is correct. While HTML5 allows multiple H1 tags on a single page, it's bad practice because the H1 communicates "this is the primary topical focus of this unique page".
Because of that, if you have headlines within the content area that point to content elsewhere on the site, and you link to that other content, those are absolutely best served with H2 headline tags - or, failing that, at the very least "strong" tags - if the topic of each target page is significantly different from the primary topic of the page they're all listed on.
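As a rough illustration (the page and article names here are hypothetical), the homepage markup might look something like this:

```html
<!-- One H1 communicating the primary topic of this unique page -->
<h1>Example Widgets Blog</h1>

<!-- Article titles listed on the homepage get H2 tags, each linking
     to its own page, where that title becomes the target page's H1 -->
<h2><a href="/articles/choosing-a-widget/">Choosing a Widget</a></h2>
<p>A short teaser for the article...</p>

<h2><a href="/articles/widget-maintenance/">Widget Maintenance</a></h2>
<p>Another short teaser...</p>
```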
-
RE: In the U.S., how can I stop the European version of my site from outranking the U.S. version?
Have you set the different hreflang tags appropriately across your content?
You said "US" and "European" - so does that mean you have just one set of content for all of Europe? If so, that can be more difficult to deal with, however if you set all of the US pages with an hreflang of "en-us" and the European pages with an hreflang of en-gb, you can at least help Google understand "this set is for the U.S. and this set is not".
What I always recommend, if you're not targeting individual countries with your content (the "Europe" reference you made says you are not for that content), is to at the very least split the content out to two different domains. Have a .com domain for US content, and a separate .eu or .co.uk or .de or whatever other domain for your European content. That, combined with the hreflang tagging, is much more helpful in communicating which set of content should show up higher in which country's search results.
You'll also need to accumulate inbound geo-relevant links to point to the appropriate content set to help reinforce this.
And if you split out domains, you can set country targeting more readily in Google Search Console.
-
RE: What's the best possible URL structure for a local search engine?
In regard to shorter URLs:
The goal is to find a proper balance for your needs. You want to group things into sub-groups based on proper hierarchy, however you also don't want to go too deep if you don't have enough pages/individual listings deep down the chain.
So the Moz post you point to refers to that - at a certain point, having too many layers can be a problem. However, there is no one single correct answer.
The most important thing to be aware of and consider is your own research and evaluation process for your situation in your market.
However, as far as what you found most people search for, be aware that with location-based search, many people don't actually type in a location when they search. Yet Google DOES factor in the searcher's location when deciding what to present in results. So the location matters even though people don't always include it themselves.
Don't become completely lost in making the decision either, though - consider all the factors, make a business decision to move forward with what you come up with, and be consistent in applying that plan across the board.
What I mean in regard to URLs and Breadcrumbs:
If the URL is www.askme.com/delhi/saket/pizza/pizza-hut/ the breadcrumb should be:
Home > Delhi > Saket > Pizza > Pizza Hut
If the URL is www.askme.com/pizza-huts/saket-delhi/ the breadcrumb should be:
Home > Pizza Hut > Saket-Delhi
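As a rough sketch of the first version (the markup details are hypothetical), each breadcrumb level would link to its corresponding URL layer:

```html
<nav class="breadcrumb">
  <a href="/">Home</a> &gt;
  <a href="/delhi/">Delhi</a> &gt;
  <a href="/delhi/saket/">Saket</a> &gt;
  <a href="/delhi/saket/pizza/">Pizza</a> &gt;
  <span>Pizza Hut</span> <!-- current page, no link -->
</nav>
```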
-
RE: What's the best possible URL structure for a local search engine?
Proximity to root is not a valid best practice, especially in this instance.
Here's why:
More people search based on geo-location than on the actual business name when looking for location-based businesses. So putting "Pizza Hut" first contradicts that reality. It implies "more people look for Pizza Hut than the number of people looking for all the different businesses in this geo-location".
Also, the URL you suggest is blatant over-optimization - an attempt to stuff exact-match keywords into the URL. In reality, people use a very wide range of keyword variations, so that's another conflict that harms your overall focus.
All of the individual factors need to reinforce each other as much as is reasonable for human readability. So the URL and breadcrumb should both follow the same sequence. If one has one sequence and the other has a different sequence, that confuses search algorithms.
-
RE: What's the best possible URL structure for a local search engine?
The local pack exists, yet it's far from complete or consistently helpful. Business directories thrive even in the age of local packs. It's all about finding the best way to provide value, and the internet is large enough for many players to play the game.
-
RE: What's the best possible URL structure for a local search engine?
Business listing directory environments have a big challenge when it comes to URL structure / information architecture and content organization because:
- Many businesses are searched for based on geo-location
- Many of those require hyper-local referencing, while many others can be "in the general vicinity"
- Many other businesses are not as relevant to geo-location
So what is a site to do?
The best path is to recognize that as mobile becomes more and more central to searcher needs, hyper-local optimization becomes more critical. It becomes the most important focus for SEO.
As a result, URL structure needs to reflect hyper-local first and foremost. So:
- www.askme.com/delhi/
- www.askme.com/delhi/saket/
- www.askme.com/delhi/saket/pizza/
- www.askme.com/delhi/saket/pizza/pizza-hut/
This way, if someone searches for "Pizza Hut Delhi", all of the Delhi Pizza Huts will show up, regardless of neighborhood, while anyone searching for "Pizza Hut Saket" will get more micro-locally relevant results.
And for those businesses that serve a wider geo-area, even though they too will be assigned a hyper-local final destination page, they will still be related to their broader geo-area as well. So someone searching "plumbers in Delhi" will get the right results and can then choose any of the plumbers in Delhi, regardless of what neighborhood they are in.
Note how I removed /search/ from the URL structure as well. It's an irrelevant level.
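As a rough sketch (the title wording is hypothetical), the head of that final destination page could reinforce the same hierarchy:

```html
<head>
  <title>Pizza Hut in Saket, Delhi | AskMe</title>
  <!-- Canonical URL matches the hyper-local hierarchy exactly -->
  <link rel="canonical" href="https://www.askme.com/delhi/saket/pizza/pizza-hut/" />
</head>
```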
-
RE: How authentic is a dynamic footer from bots' perspective?
Nitin
You're dealing with multiple considerations and multiple issues in this setup.
First, it's a matter of link distribution. When you link to x pages from page 1, this informs search engines "we think these are important destination pages". If you change those links every day, or on every refresh, and if crawlers also encounter those changes, it's going to strain that communication.
This is something that happens naturally on news sites - news changes on a regular basis. So it's not completely invalid and alien to search algorithms to see or deal with. And thus it's not likely their systems would consider this black hat.
The scale and frequency of the changes is more of a concern because of that constantly changing link value distribution issue.
Either X cities are really "top" cities, or they are not.
Next, that link value distribution is further weakened by the sheer volume of links. 25 links per section, three sections - that's 75 links. Add in the links at the top of the page, the "scrolling" links in the main content area of the home page, and the actual "footer" links (black background), and link equity gets diluted even further. (Think "spreading too thin" with too many links.)
On category pages it's "only" 50 links in two sub-footer sections. Yet the total number of links even on a category page is a concern.
And on category pages, all those links dilute the primary focus of the page. If a category page is "Cell Phone Accessories in Bangalore", all of the links in the "Top Cities" section dilute the geographic focus, and all the links in the "Trending Searches" section dilute the non-geo topical focus.
What we end up with here then is an attempt to "link to all the things". This is never a best practice strategy.
Best practice strategies require a refined experience across the board. Consistency of signals, combined with not over-straining link equity distribution and with a refined, non-diluted topical focus, is the best path to long-term success.
So, in the example I gave initially about news sites changing the actual links shown when new news comes along: the best news sites do that without constantly changing the primary categories featured, and without diluting the overwhelming majority of links on a single category page with lots of links to other categories. Consistency is critical.
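As a rough sketch of the alternative (the city names and class name are placeholders), a stable, curated footer section would show the same small set of links on every crawl:

```html
<div class="top-cities">
  <h2>Top Cities</h2>
  <!-- A fixed, genuinely curated list - not rotated per refresh -->
  <a href="/delhi/">Delhi</a>
  <a href="/mumbai/">Mumbai</a>
  <a href="/bangalore/">Bangalore</a>
  <a href="/chennai/">Chennai</a>
</div>
```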
So - while any one or a handful of these issues might not be a critical problem on its own, the cumulative negative impact harms the site's ability to communicate a quality, consistent message.
The combined problem then needs to be recognized as exponentially more problematic because of the scale at which this is happening across the entire site.