Adding breadcrumbs in the body of a page
-
We want to implement breadcrumbs to improve the usability of our website. If we manually input breadcrumbs into the body of every page via our CMS, are there any negative effects?
-
Breadcrumbs add great value to pages and help pass PageRank around the site. However, since you are maintaining these links manually, you had better keep them under tight control to make sure the breadcrumbs don't eventually lead to 404 pages. Also, I strongly suggest using the breadcrumb microformat for an enhanced display in Google's search results.
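As a concrete illustration, breadcrumbs can be marked up with schema.org's BreadcrumbList vocabulary, shown here as JSON-LD, the form Google now recommends over the older microformat (the URLs and names below are placeholders):

```json
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home", "item": "https://www.example.com/" },
    { "@type": "ListItem", "position": 2, "name": "Services", "item": "https://www.example.com/services/" },
    { "@type": "ListItem", "position": 3, "name": "Current Page" }
  ]
}
```

The block goes inside a `<script type="application/ld+json">` element in the page's head or body; the last item (the current page) can omit its "item" URL.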
Related Questions
-
Business Services via Index Page or Dedicated Pages?
We're in the process of building a new website for our Business (B2B Event Services) and we've hit a design snag. Our designer wants to combine all of our business services information (there are six service lines) into a single Index Page titled "Services". Beyond this creating a needlessly long page to scroll through, I'm worried this will negatively impact our ability to SEO for each service line, as we wouldn't have any intention of letting people visit the individual pages from within the site. Our current site features individual pages which, in my opinion, is how we should build our new site. That being said, I'm completely open to any ideas that will further enhance usability and searchability. Any clarification would be greatly appreciated!
Web Design | SHWorldwide1
Https pages indexed but all web pages are http - please can you offer some help?
Dear Moz Community, please could you see what you think and offer some definite steps or advice.

I contacted the host provider and his initial thought was that WordPress was causing the https problem: e.g. when an https version of a page is called, things like videos and media don't always show up. An SSL certificate attached to a website can allow pages to load over https. The host said that there is no active configured SSL; it's just waiting as part of the hosting package in case it's needed. However, I found that the SSL certificate still shows up during a crawl.

It's important to eliminate the https problem before external backlinks point to any of the unwanted https pages that are currently indexed. Luckily I haven't started any intense backlinking work yet, and any links I have posted in search land have all been the http version.

I checked a few more URLs to see if it's necessary to create a permanent redirect from https to http. For example, I requested domain.co.uk with the https:// prefix, and the https:// page loaded instead of redirecting automatically to the http version. I know that if I am automatically redirected to the http:// version of the page, then that is the way it should be: search engines and visitors stay on the http version of the site and don't get lost anywhere in https. This also helps to eliminate duplicate content and to preserve link juice. What are your thoughts on that?

As I understand it, most server configurations should redirect by default when https isn't configured, and from my experience I've seen cases where pages requested via https return the default server page, a 404 error, or duplicate content. So I'm confused as to where to take this.

One suggestion would be to disable https entirely, since there is no need to have any traces of SSL when the site is crawled. I don't want to enable https in the htaccess only to then create an https-to-http rewrite rule; https shouldn't even be a crawlable function of the site at all:

RewriteEngine On
RewriteCond %{HTTPS} off

or to disable the SSL completely for now, until it becomes a necessity for the website. I would really welcome your thoughts, as I'm really stuck as to what to do for the best, short term and long term. Kind Regards
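For reference, a minimal .htaccess sketch of the permanent https-to-http redirect discussed above, assuming mod_rewrite is enabled (the rule is illustrative, not a definitive fix for this site):

```apache
# Send any https request back to its http equivalent with a 301
RewriteEngine On
RewriteCond %{HTTPS} on
RewriteRule ^(.*)$ http://%{HTTP_HOST}/$1 [R=301,L]
```

Note this is the opposite of the `RewriteCond %{HTTPS} off` condition quoted in the post, which matches plain-http requests; to redirect https traffic the condition must test for `on`.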
Web Design | SEOguy10
Problems preventing WordPress attachment pages from being indexed and from being seen as duplicate content.
Hi. According to a Moz crawl, it looks like the WordPress attachment pages from all image uploads are being indexed and seen as duplicate content - or is it the Yoast sitemap causing it? I see 2 options in Yoast SEO: (1) Redirect attachment URLs to parent post URL; (2) Media... Meta Robots: noindex, follow. I set it to (1) initially, which didn't resolve the problem. Then I set it to option (2) so that the attachment pages won't be indexed but search engines would still associate those images with their relevant posts and pages. I understand what both options (1) and (2) mean, but because I chose option (2), will that mean none of the images on the website stand a chance of being indexed in search engines, Google Images etc.? As far as duplicate content goes, search engines can get confused when there are 2 ways for them to reach the correct page content destination, and when e.g. Google makes the wrong choice, a portion of traffic drops off (is lost, hence the errors), which leaves the searcher frustrated and affects the SEO and ranking of the site, worsening with time. My goal here is: I would like all of the web images to be indexed by Google, and for all of the image attachment pages to not be indexed at all (Moz shows the image attachment pages as duplicates, and the referring URL causing this is the sitemap URL which Yoast creates); that sitemap URL has been submitted to the search engines already, and I will resubmit once I can resolve the attachment pages issue. Please can you advise. Thanks.
Web Design | SEOguy1
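If Yoast's option (1) doesn't take effect, the same behaviour can be sketched with standard WordPress hooks in a small theme or plugin snippet (a hypothetical sketch, not the Yoast implementation):

```php
<?php
// Hypothetical sketch: 301-redirect attachment pages to their parent post,
// or to the home page when the attachment has no parent.
add_action( 'template_redirect', function () {
    if ( is_attachment() ) {
        $parent_id = wp_get_post_parent_id( get_the_ID() );
        $target    = $parent_id ? get_permalink( $parent_id ) : home_url( '/' );
        wp_safe_redirect( $target, 301 );
        exit;
    }
} );
```

With attachment pages permanently redirected, the images themselves remain indexable via their file URLs and the posts that embed them.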
Attachment Pages
I have hundreds/thousands of images on my site, but for some reason the images on this page - http://indigocarhire.co.uk/top-of-the-range-car-hire/ - are being flagged as attachment pages, meaning I'm getting errors for duplicate titles, missing metas etc. Why are these images, and only these ones, being flagged up? They have been added in exactly the same way as every other image on the site. Appreciate any advice. Thanks
Web Design | RGOnline0
Joomla (page title override not working properly) - any techy guys out there?
Hey Mozzers, I am having some problems with Joomla. I have tried many support forums, and since everyone here is in the same field as me, I thought this would be a great place to ask. I am working with Joomla 2.5. After turning on search engine friendly (SEF) URLs, you can override the alias of a page by setting a browser page title in its page display options. So I turned on SEF in the global config, turned on mod_rewrite, and made sure my htaccess file was not .txt. But I am having some problems with this.
On some pages the page display option for the browser page title works, and on some it doesn't. On the pages where it doesn't, it is pulling the information from the alias (which is common with most sites). Why is it doing this? You can check out the pages yourself. Here is a page where it is not working:
http://tungstengem.com/mens-wedding-bands
and here is a page where it is working:
http://tungstengem.com/mens-wedding-...-bands-for-men
Also, for my homepage: when I didn't have my Apache rewrite on, it showed the index.php and I was able to add an alias to it. Now the alias for the home page is not working.
Web Design | BizDetox
Best way of conserving link juice from non important pages
If I have a bunch of non-important pages on my website which are of little use in the search engines' index - i.e. contact-us pages, pages which are near-duplicates and conflict with keywords targeting other pages, etc. - what is the best way of retaining the link juice that would normally be passed to these pages? The most recent discussion I have read says that with nofollow you effectively just lose link juice rather than conserving it, so that doesn't seem a great option. If I "noindex" these pages, would that conserve the link juice within the site, or again would it just be lost? It seems quite a tricky situation: many pages are legitimate for customer usability but are not worth having in the search engines' index, and you are better off consolidating link juice - so it seems you get penalised for making something "for users". Thanks
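For reference, the noindex, follow directive discussed above is a single meta tag in the page head (a minimal sketch; note that whether link equity keeps flowing through pages that stay noindexed long-term is debated):

```html
<!-- Keeps this page out of the index while still letting crawlers follow its links -->
<meta name="robots" content="noindex, follow">
```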
Web Design | James770
Page Title or Search Friendly Urls?
We are currently auditing our website as part of our SEO strategy. One item which has come up is the importance of search-friendly URLs versus search engine friendly page titles. Do URLs or page titles carry more relevance than the other in search engines? Obviously the ideal would be to have both to maximise search impact, but does either carry more importance? Thanks
Web Design | bwfc770
Are slimmed down mobile versions of a canonical page considered cloaking?
We are developing our mobile site right now and we are using a user agent sniffer to figure out what kind of device the visitor is using. Once the server knows whether it is a desktop or mobile browser it will deliver the appropriate template. We decided to use the same URL for both versions of the page rather than using m.websiteurl.com or www.websiteurl.mobi so that traffic to either version of these pages would register as a visit to the page. Will search engines consider this cloaking or is mobile "versioning" an acceptable practice? The pages in essence are the same, the mobile version will just leave out extraneous scripts and unnecessary resources to better display on a mobile device.
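The setup described above is what Google's mobile documentation calls dynamic serving rather than cloaking, provided both user agents receive substantially the same content and the server sends a Vary: User-Agent header so caches and crawlers know the response differs by device. A hypothetical PHP sketch (the template paths and the UA pattern are illustrative):

```php
<?php
// Hypothetical sketch: serve one of two templates from the same URL
// based on a simple User-Agent sniff.
$ua        = isset( $_SERVER['HTTP_USER_AGENT'] ) ? $_SERVER['HTTP_USER_AGENT'] : '';
$is_mobile = (bool) preg_match( '/Mobile|Android|iPhone|iPad/i', $ua );

// Tell caches and crawlers that the response varies by User-Agent.
header( 'Vary: User-Agent' );

require $is_mobile ? 'templates/mobile.php' : 'templates/desktop.php';
```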
Web Design | TahoeMountain400