OK, thanks for confirming this. Yup, it's the exact same text that would appear at the bottom of all pages within that section. The text, as Google would see it (expanded), is very long and comes from a database of text, so images aren't an option in our case. AJAX would work, though.
Posts made by boxcarpress
-
RE: Frequent FAQs vs duplicate content
-
Frequent FAQs vs duplicate content
It would be helpful for our visitors if we were to include an expandable list of FAQs on most pages. Each section would have its own list of FAQs specific to that section, but all the pages in that section would have the same text. It occurred to me that Google might view this as a duplicate content issue. Each page _does_ have a lot of unique text, but underneath we would have lots of text repeated throughout the site.
Should I be concerned? I guess I could always load these by AJAX after page load if this might penalize us.
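The AJAX idea could be sketched roughly as follows, keeping the shared FAQ text out of the initial HTML that crawlers fetch. This is a minimal sketch, assuming a hypothetical `/faqs/<section>.html` endpoint and a `#faq-container` element; also bear in mind that Googlebot can execute JavaScript these days, so deferred loading isn't guaranteed to keep the text out of the index:

```javascript
// Build the (hypothetical) endpoint URL for a section's shared FAQ block.
function faqUrl(section) {
  return "/faqs/" + encodeURIComponent(section) + ".html";
}

// In the browser, inject the FAQs only after the main content has loaded,
// so the duplicated text is absent from the initial HTML.
if (typeof document !== "undefined") {
  window.addEventListener("load", function () {
    fetch(faqUrl(document.body.dataset.section))
      .then(function (res) { return res.text(); })
      .then(function (html) {
        document.getElementById("faq-container").innerHTML = html;
      });
  });
}
```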
-
Excessive navigation links
I'm working on the code for a collaborative project that will eventually have hundreds of pages. The editor of this project wants all pages to be listed in the main navigation at the top of the site. There are four main dropdown (suckerfish-style) menus, and these have nested sub- and sub-sub-menus. Putting aside the UI issues this creates, I'm concerned about how Google will find our content on the page. Right now we have over 120 links above the main content of the page, and we plan to add more as time goes on (as new pages are created).
Perhaps of note, these navigation elements are within an HTML5 <nav> element:
<nav id="access" role="navigation">
Do you think that Google is savvy enough to overlook the "abundant" navigation links and focus on the content of the page below? Will the <nav> element help us get away with this navigation strategy? Or should I reel some of these navigation pages into categories? As you might surmise, the site has a fairly flat structure, hence the lack of category pages.
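If some of those flat pages were reeled into categories, the top nav could carry one link per category instead of one per page. A rough sketch of the grouping, with hypothetical page titles and categories:

```javascript
// Group a flat list of pages into category submenus so the top-level nav
// shows one link per category rather than one per page.
// Page titles and category names here are made up for illustration.
function groupByCategory(pages) {
  var menu = {};
  pages.forEach(function (page) {
    (menu[page.category] = menu[page.category] || []).push(page.title);
  });
  return menu;
}

var pages = [
  { title: "History", category: "About" },
  { title: "Staff", category: "About" },
  { title: "Pricing", category: "Services" },
  { title: "Turnaround", category: "Services" }
];

// Four top-level links collapse into two category links with submenus.
var menu = groupByCategory(pages);
```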
-
RE: Url structure for multiple search filters applied to products
Thanks for the links and the advice, Marcus.
I think after reading through the material I will meta noindex any search that has more than one search filter applied. So I'll index "blue" or "vintage" but not "vintage/blue" for instance. The most important top level search filters will become category pages, more or less. I'll try to tailor their content to reflect their importance. Thanks for your input!
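The "noindex anything with more than one filter" rule is easy to enforce mechanically. A minimal sketch, assuming the filter values appear as path segments after a `/main-keyword/` prefix (that prefix and the meta tag output are illustrative, not your actual implementation):

```javascript
// Decide whether a filtered-search URL should carry a robots noindex tag.
// Assumes filters appear as path segments after "/main-keyword/".
function robotsMetaFor(path) {
  var segments = path.replace(/^\/main-keyword\/?/, "")
                     .split("/")
                     .filter(Boolean);
  // One filter (e.g. /main-keyword/blue/) stays indexable;
  // two or more (e.g. /main-keyword/vintage/blue/) get noindexed.
  return segments.length > 1
    ? '<meta name="robots" content="noindex, follow">'
    : "";
}
```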
-
Url structure for multiple search filters applied to products
We have a product catalog with several hundred similar products. Our list of products allows you to apply filters to hone your search, so that in fact there are over 150,000 different individual searches you could come up with on this page. Some of these searches are relevant to our SEO strategy, but most are not.
Right now (for the most part) we save the state of each search in the fragment of the URL, or in other words in a way that isn't indexed by the search engines. The URL (without hashes) ranks very well in Google for our one main keyword. At the moment, Google doesn't recognize the variety of content possible on this page. An example is:
http://www.example.com/main-keyword.html#style=vintage&color=blue&season=spring
We're moving towards a more indexable URL structure and one that could potentially save the state of all 150,000 searches in a way that Google could read. An example would be:
http://www.example.com/main-keyword/vintage/blue/spring/
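The mapping from the current fragment-style URLs to the new path style could be sketched as a straight conversion. The parameter names come from the example above; the fixed filter order (style, color, season) is an assumption:

```javascript
// Convert the current fragment-style search state, e.g.
//   "#style=vintage&color=blue&season=spring"
// into the proposed path style, e.g.
//   "/main-keyword/vintage/blue/spring/"
// The segment order below is assumed, not taken from the real site.
var FILTER_ORDER = ["style", "color", "season"];

function fragmentToPath(fragment) {
  var params = {};
  fragment.replace(/^#/, "").split("&").filter(Boolean).forEach(function (pair) {
    var kv = pair.split("=");
    params[kv[0]] = kv[1];
  });
  var segments = FILTER_ORDER
    .map(function (key) { return params[key]; })
    .filter(Boolean);
  return "/main-keyword/" + segments.map(encodeURIComponent).join("/") + "/";
}
```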
I worry, though, that giving so many options in our URLs will confuse Google and create a lot of duplicate content. After all, we only have a few hundred products, and inevitably many of the searches will look pretty similar. I also worry about losing ground on the main http://www.example.com/main-keyword.html page, when it's ranking so well at the moment.
So I guess the questions are:
-
Is there such a thing as having URLs that are too specific? Should we noindex or set rel=canonical on the pages whose keywords are nested too deeply?
-
Will our main keyword's page suffer when it has to share all the inbound links with these other, more specific searches?
-
Are keyword rankings a zero-sum game for our site?
We’ve made pages with resources for our customers. These pages have been well received and have gotten us some good traffic, but they only target our main keywords tangentially. If we continue to build up pages like this, pages that bring us traffic from our customers but don’t directly target our main keywords, will our target keywords, and the pages that focus on those keywords, suffer?
Is it a zero-sum game for our website? Does increasing rankings and pages for certain keywords mean that other keywords and pages decrease as a result?