Having Content Be the First Thing the Bots See
-
If you have all of your homepage content in a tab set at the bottom of the page, but you really want that content to be the first thing Google reads when it crawls your site, is there something you can implement so that Google reads your content before the rest of the page? Does this cause any violations, or are there red flags raised by doing this? The goal here is just to get Google to read the content first, not to hide any content.
-
Only the first line should be an h1, not the whole content block. We styled it all the same so it didn't look silly. We did make the local cities h2s... not sure if that's good or bad... but it stinks to serve so many cities and only rank for your physical location, especially when there are 20 cities within 20 miles here in the DC metro area.
I'm not sure whether local "city pages" will work, or how they change the landing page experience versus a very interactive home page... Google didn't think about all of that!
-
Just checked how you have done it and I see what you mean - it's a bit tricky. One thing I noticed is that all of that text is wrapped in an h1. I would take it out and put it in as standard content.
Also, if you could take the text that is in your slideshow images and convert it to readable text, that would give you a bit more relevant content on the site, which may help.
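To illustrate the idea (a hypothetical sketch; the class names and file names are made up, not from the site in question), each slide's message could live in markup layered over the image rather than being baked into the image file:

```html
<!-- Hypothetical sketch: slide copy as real markup over the image,
     instead of being part of the image file itself. -->
<div class="slide" style="background-image: url('slide1.jpg');">
  <h2>Short, clear value proposition</h2>
  <p>Customer pain-point messaging that crawlers can actually read.</p>
</div>
```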
Best of luck with it!
-
Well... darn... it's pretty much in the footer. Check out imageworksstudio.com (about tab, lower left).
The thing is, you don't really want to spam up your home page with content. As a branding firm, we prefer short, clear messaging that is focused on customer pain points, value props, etc. Of course, these are images and not really SEO-relevant anyway. Grrr - a double-edged sword.
Thanks again. I appreciate your comments.
-
It is done using CSS, but it needs clarifying whether the content sits far down because of other content on the page, or because of HTML tags (perhaps from a navigation). The former might make a difference, but I think Google can detect that trick anyway. The latter is irrelevant in my opinion, as those tags will be discounted.
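To illustrate the CSS approach in question (a minimal sketch, assuming a flexbox layout; the class names are hypothetical), the content can come first in the HTML source while CSS displays a banner above it:

```html
<!-- Minimal sketch: content-first source order, visually reordered.
     Class names are hypothetical, not from any site discussed here. -->
<style>
  .page { display: flex; flex-direction: column; }
  .banner { order: 1; }       /* displayed first visually */
  .main-content { order: 2; } /* displayed second, but first in source */
</style>
<div class="page">
  <div class="main-content">
    <h1>Primary heading</h1>
    <p>Homepage copy that crawlers encounter early in the HTML.</p>
  </div>
  <div class="banner">Slideshow / banner markup</div>
</div>
```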
-
There's been a bit of discussion about this before, and I seem to remember that using CSS to push content up the page actually had a slightly beneficial effect on rankings.
It's mainly going to be an issue if your content sits really low on the page due to things like intrusive banner ads or lots of adverts.
-
That's what I thought too... but I'm old-school SEO and have no idea whether this has changed! Thanks.
-
This can be done via CSS, but I'm not sure it has much value any more. It was common practice a couple of years back, but I don't think it's necessary these days.
Related Questions
-
Silo Architecture and Mobile First
This goes to the age-old SEO argument: how many links belong in the navigation? We are a well-known brick-and-mortar brand. We have 20,000 SKUs and over 500 categories and subcategories. 95%+ of our backlinks go to the home page. We don't have a blog, but it's in the works. Our site is not responsive; it serves up different versions based on device type, but is not an "M Dot". Our rankings are pretty strong in spite of a large number of technical SEO issues (different discussion).

Currently, our e-commerce desktop site is "siloed" (I'm new to the company - I didn't do it). The home page links via the top nav to categories. The category pages link to subcategories via sidebar navigation, or via images on the category pages (instead of product images). It's pretty close to textbook silos, and it's very near how I would have designed it. This silo architecture passes the most link juice to our categories, which target our highest-search-volume (head) terms. The categories pass link juice (albeit significantly less) to our subcats, which target secondary terms. In terms of search volume and commercial value, our tiers line up very neatly. On average, the targeted subcat terms get about 1/6 of the volume of our head terms. The silo concept has been around forever, and is evangelized by Bruce Clay and other respected SEOs. Every time I've siloed an ecommerce site, the rankings improve dramatically, so who am I to argue? So, what's the problem? Read on...

Our mobile navigation, on the other hand, links to every category and subcategory via flyout navigation (I didn't do this, either). In theory, this distributes an equal amount of link juice to all categories and subcategories: it robs link juice from our categories and passes it to subcategories. Right now, this isn't a problem. Rankings are based on the desktop site, and minor adjustments are made for mobile rankings. When Mobile First rolls out, our mobile nav will be the default navigation for Google, and in theory, link juice distribution across the site will change radically and potentially harm our rankings for our head terms.

I always study the site architecture of a number of respected ecommerce sites. Target and Walmart, for example, link to every category and subcategory through their mobile and desktop navigation. Wayfair takes a silo approach on mobile and desktop, linking in tiers. I would argue that Walmart and Target have so much DA/TF/CF that they don't give a damn about targeted link juice distribution - it's all about UX. Wayfair's backlink profile is strong, but it's not Walmart or Target, so they need to be concerned about link juice distribution - hence the silo approach.

Have the Google spokespeople said anything about this? I see this as a potential landmine across the industry. Is this something I should be concerned about? Has anyone had any experience with de-siloing a website? Am I making a big deal out of a non-issue? Please - no arguments about usability. UX is absolutely part of the equation, and usability is a ranking factor, but if our rankings and traffic take a nosedive, UX isn't going to matter. This is a theoretical discussion of link juice distribution, and I know that compromises need to be made between SEO and UX.
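To illustrate the two linking patterns being contrasted (a hypothetical sketch; the URLs and labels are made up, not this site's actual markup):

```html
<!-- Hypothetical sketch of the two navigation patterns discussed. -->

<!-- Siloed desktop nav: the home page links only to categories;
     subcategories are linked from within each category page. -->
<nav>
  <a href="/category-a/">Category A</a>
  <a href="/category-b/">Category B</a>
</nav>

<!-- Flat mobile flyout: every category AND subcategory is linked
     from every page, spreading link equity evenly across tiers. -->
<nav>
  <a href="/category-a/">Category A</a>
  <a href="/category-a/subcat-1/">Subcat 1</a>
  <a href="/category-a/subcat-2/">Subcat 2</a>
  <a href="/category-b/">Category B</a>
  <a href="/category-b/subcat-1/">Subcat 1</a>
</nav>
```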
Intermediate & Advanced SEO | Satans_Apprentice
-
Duplicate content on URL trailing slash
Hello, Some time ago, we accidentally made changes to our site that modified the way URLs in links are generated. At once, trailing slashes were added to many URLs (only in links). Links that used to point to example.com/webpage.html were now linking to example.com/webpage.html/. URLs in the XML sitemap remained unchanged (no trailing slash). We started noticing duplicate content (because our site renders the same page with or without the trailing slash). We corrected the problematic PHP URL function so that now all links on the site point to a URL without a trailing slash. However, Google had time to index these pages. Are 301 redirects required in this case?
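To illustrate the tags in question (a minimal sketch, assuming the non-slash URL is the preferred version; a canonical like this is a common complement to the 301s being weighed):

```html
<!-- Minimal sketch: served on both example.com/webpage.html and
     example.com/webpage.html/, pointing at the preferred URL. -->
<link rel="canonical" href="http://example.com/webpage.html" />
```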
Intermediate & Advanced SEO | yacpro13
-
Duplicate Content Issues :(
I am wondering how we can solve our duplicate content issues. Here is the thing: there are only so many ways you can write a description of a used watch. http://beckertime.com/product/mens-rolex-air-king-no-date-stainless-steel-watch-wsilver-dial-5500/ http://beckertime.com/product/mens-rolex-air-king-stainless-steel-date-watch-wblue-dial-5500/ What's different between these two? The dial color. We have a lot of the same model numbers but with different conditions, dial colors, and bands. What ideas do you have?
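To illustrate one way near-duplicate variant pages can be differentiated (a hypothetical sketch; the property values here are illustrative, not from the actual product pages), the distinguishing attributes can be exposed as structured data:

```html
<!-- Hypothetical sketch: schema.org Product markup surfacing the
     attributes that actually distinguish the two watch pages. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Men's Rolex Air-King 5500",
  "color": "Silver dial",
  "itemCondition": "https://schema.org/UsedCondition"
}
</script>
```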
Intermediate & Advanced SEO | KingRosales
-
Does Google see this as duplicate content?
I'm working on a site that has too many pages in Google's index, as shown by a simple count via a site search (example): site:http://www.mozquestionexample.com

I ended up getting a full list of these pages, and it shows pages that have supposedly been excluded from the index via GWT URL parameters and/or canonicalization. For instance, the list of indexed pages shows:

1. http://www.mozquestionexample.com/cool-stuff
2. http://www.mozquestionexample.com/cool-stuff?page=2
3. http://www.mozquestionexample.com?page=3
4. http://www.mozquestionexample.com?mq_source=q-and-a
5. http://www.mozquestionexample.com?type=productss&sort=1date

Example #1 above is the one true page for search and the one that all the canonicals reference. Examples #2 and #3 shouldn't be in the index because the canonical points to URL #1. Example #4 shouldn't be in the index because it's just a source parameter that, again, doesn't change the page, and the canonical points to #1. Example #5 shouldn't be in the index because its parameters are excluded as not affecting page content, and the canonical is in place.

Should I worry about these multiple URLs for the same page, and if so, what should I do about it? Thanks... Darcy
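To illustrate a stronger signal than a canonical for such cases (a sketch, assuming the parameter variants can emit their own head tags), an explicit noindex can be served only on the variants:

```html
<!-- Sketch: served only on parameter variants such as ?page=2 or
     ?mq_source=..., never on the canonical page itself. -->
<meta name="robots" content="noindex, follow" />
```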
Intermediate & Advanced SEO | 94501
-
Duplicate content on subdomains
Hi All, The structure of the main website goes by http://abc.com/state/city/publication. We have a partnership with public libraries to give local users access to the publication content for free. We have over 100 subdomains (one for each specific library) that have duplicate content issues with the root domain. Most subdomains have very high page authority (the main public library and other local .gov websites link to these subdomains). Currently these subdomains are not indexed because the robots.txt file excludes bots from crawling. I am in the process of setting canonical tags on each subdomain and opening up the robots.txt file. Should I set the canonical tag on each subdomain (homepage) to the root domain version or to the specific city within the root domain?

Example 1:
Option 1: http://covina.abc.com/ = Canonical Tag = http://abc.com/us/california/covina/
Option 2: http://covina.abc.com/ = Canonical Tag = http://abc.com/

Example 2:
Option 1: http://galveston.abc.com/ = Canonical Tag = http://abc.com/us/texas/galveston/
Option 2: http://galveston.abc.com/ = Canonical Tag = http://abc.com/

Example 3:
Option 1: http://hutchnews.abc.com/ = Canonical Tag = http://abc.com/us/kansas/hutchinson/
Option 2: http://hutchnews.abc.com/ = Canonical Tag = http://abc.com/

I believe it makes more sense to set the canonical tag to the corresponding city (option 1), but I wonder whether setting the canonical tag to the root domain would pass some link juice to the root domain and be more beneficial. Thanks!
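To illustrate option 1 with the Covina example from above (a sketch of where the tag would sit):

```html
<!-- Sketch: placed in the <head> of http://covina.abc.com/,
     pointing at the matching city page on the root domain. -->
<link rel="canonical" href="http://abc.com/us/california/covina/" />
```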
Intermediate & Advanced SEO | NewspaperArchive
-
Question on Moving Content
I just moved my site from a WordPress-hosted site to Squarespace. We have the same domain; however, the content is now located at different URLs (again, same base domain). I'm unable to easily set up 301 redirects mapping the old content to the new content, so I was wondering if anyone had any recommendations for a workaround. Basically, I want to make sure Google knows that Product A's page is now located at this new URL (www.domain.com/11245 > www.domain.com/product-a). Maybe it's something that I don't have to worry about anymore because the old content is gone? I mean, I have a global redirect set up so that no matter what you enter after the base domain, it goes to the homepage, but I just want to make sure I'm not missing something here. Really appreciate your help!
Intermediate & Advanced SEO | TheBatesMillStore
-
Google: How to See URLs Blocked by Robots?
Google Webmaster Tools says we have 17K out of 34K URLs that are blocked by our robots.txt file. How can I see the URLs that are being blocked? Here's our robots.txt file:

User-agent: *
Disallow: /swish.cgi
Disallow: /demo
Disallow: /reviews/review.php/new/
Disallow: /cgi-audiobooksonline/sb/order.cgi
Disallow: /cgi-audiobooksonline/sb/productsearch.cgi
Disallow: /cgi-audiobooksonline/sb/billing.cgi
Disallow: /cgi-audiobooksonline/sb/inv.cgi
Disallow: /cgi-audiobooksonline/sb/new_options.cgi
Disallow: /cgi-audiobooksonline/sb/registration.cgi
Disallow: /cgi-audiobooksonline/sb/tellfriend.cgi
Disallow: /*?gdftrk

Sitemap: http://www.audiobooksonline.com/google-sitemap.xml
Intermediate & Advanced SEO | lbohen
-
Different website is shown when searched for content
Hello, Has anybody experienced this situation? When I take content from http://www.creativethemes.net (e.g., from the home page) and search for it on Google, the search results show FMEextensions with that content. FME has never used that data; it cannot be found in the source files either, and the data was never written for or placed on the FME site, yet it is shown with FME. How can this happen, and what do you suggest to resolve it?
Intermediate & Advanced SEO | MozAddict