Having Content be the First thing the bots see
-
If you have all of your homepage content in a tab set at the bottom of the page, but you really want that to be the first thing Google reads when it crawls your site, is there something you can implement so that Google reads your content before it reads the rest of the page? Does this cause any violations, or are there any red flags raised by doing this? The goal here is just to get Google to read the content first, not to hide any content.
-
It should only be the first line as an h1, not the whole content block. We styled it all the same so it didn't look silly. We did make the local cities h2s... not sure if that's good or bad... but it stinks to serve so many cities and only rank at your physical location, especially when there are 20 cities within 20 miles here in the DC metro area.
Not sure if local "city pages" will work, or how that changes the landing page experience versus a very interactive home page... Google didn't think about all of that!
-
Just checked how you have done it and I see what you mean - it's a bit tricky. One thing I noticed is that all that text is wrapped in an h1. I would take it out and put it in as standard content.
Also, if you could take the text that is in your slideshow images and convert it to readable text, that would give you a bit more relevant content on the site, which may help.
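A minimal sketch of what that markup change might look like (the headings and copy here are just placeholders, not the site's actual text):

```html
<!-- Before: the entire block is crawled as one giant heading -->
<h1>
  We are a branding firm... (several paragraphs of body copy)
</h1>

<!-- After: a short, focused heading plus standard body copy -->
<h1>Branding and Web Design in the DC Metro Area</h1>
<p>We are a branding firm... (the rest of the copy as normal paragraphs)</p>
```

The slideshow text can get the same treatment: render it as real HTML positioned over the image with CSS, rather than baking the words into the image file itself.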
Best of luck with it!
-
Well... darn... it's in the footer, pretty much. Check out imageworksstudio.com
(about tab, lower left)
The thing is... you don't really want to spam up your home page with content. As a branding firm, we prefer short, clear messaging that is focused on customer pain points, value props, etc. Of course, these are images and not really SEO-relevant anyway. Grrr - a double-edged sword.
Thanks again. I appreciate your comments.
-
It can be done using CSS, but it matters whether the content sits far down because of other content above it on the page, or simply because of HTML markup (a navigation block, for instance). The former might make a difference, but I think Google can detect that trick anyway. The latter is irrelevant in my opinion, as those tags will be discounted.
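For what it's worth, the usual CSS approach is to put the content first in the HTML source and let CSS control where it appears visually. A minimal sketch using modern flexbox `order` (class names are made up; the old-school version did the same thing with absolute positioning or negative margins):

```html
<style>
  .page    { display: flex; flex-direction: column; }
  /* Visual order is set by CSS, independent of source order */
  .nav     { order: 1; }  /* rendered first (top of page)   */
  .content { order: 2; }  /* rendered second, below the nav */
</style>

<div class="page">
  <!-- First in the source, so crawlers encounter it first -->
  <div class="content">Main homepage copy goes here...</div>
  <!-- Later in the source, but displayed above the content -->
  <nav class="nav">Navigation links...</nav>
</div>
```

As noted above, though, the content itself is what Google evaluates; reordering the source is unlikely to be decisive either way.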
-
There's been a bit of discussion about this before, and I seem to remember that using CSS to push content up the page actually had a slightly beneficial effect on rankings.
It's mainly going to be an issue if your content is really low down on the page due to things like intrusive banner ads or lots of adverts.
-
That's what I thought too... but I'm old-school SEO and have no idea if this has changed! Thanks.
-
This can be done via CSS, but I'm not sure doing so has value any more. It used to be a practice a couple of years back, but I don't think it is necessary anymore.
Related Questions
-
When "pruning" old content, is it normal to see a drop in Domain Authority on the Moz crawl report?
After reading several posts about the benefits of pruning old, irrelevant content, I went through a content audit exercise to kick off the year. The biggest category of changes so far has been to noindex + remove from sitemap a number of blog posts from 2015/2016 (which were very time-specific, i.e. software release details). I assigned many of the old posts a new canonical URL pointing to the parent category. I realize it'd be ideal to point to a more relevant/current blog post, but could this be where I've gone wrong? Another big change was to hide the old posts from the archive pages on the blog. Any advice/experience from anyone doing something similar much appreciated! Would be good to be reassured I'm on the right track and a slight drop is nothing to worry about. 🙂 If anyone is interested in having a look: https://vivaldi.com https://vivaldi.com/blog/snapshots [this is the category where changes have been made, primarily] https://vivaldi.com/blog/snapshots/keyboard-shortcut-editing/ [example of a pruned post]
-
Website Redesign - Duplicate Content?
I hired a company to redesign our website. There are many pages, like the example below, where we are downsizing content by 80%. (Believe me, not my decision.) Current page: https://servicechampions.com/air-conditioning/ New page (on test server): https://servicechampions.mymwpdesign.com/air-conditioning/ My question to you is: that 80% of content that I am losing in the redesign, can I republish it as a blog? I know that Google has it indexed. The old page has been live for 5 years, but now 80% of it will no longer be live. So can it become a blog post and gain new (or keep) SEO value? What should I do with the 80% of content I am losing?
-
Dynamically changing pages, same content
Hey there Mozzers, I have a commerce site that dynamically adds more products to the same page as you scroll down. I have added SEO content in the footer of the page. The URL changes as you scroll, to ?page-2, ?page-3, and so on, but the content stays the same even though the URL is changing dynamically. Is there a way to solve that issue? Should I use a canonical pointing to the initial page, thus solving the duplication, but indicate rel=next and rel=prev on the other pages? Thanks in advance
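If you go the rel=prev/next route, the head of each paginated URL would look something like this (the domain and paths are illustrative, not the actual site):

```html
<!-- In the <head> of https://example.com/category?page-2 -->
<link rel="canonical" href="https://example.com/category?page-2">
<link rel="prev" href="https://example.com/category">
<link rel="next" href="https://example.com/category?page-3">
```

One caveat worth checking against Google's own documentation: their guidance for rel=prev/next was that each paginated page should canonical to itself (or to a view-all page), not that every page should canonical to page 1, since pointing everything at page 1 tells Google to ignore the deeper pages entirely.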
-
Best practice for expandable content
We are in the middle of having new pages added to our website. On our website we will have an information section containing various details about a product; this information will be several paragraphs long. We want to show the first paragraph and have a "read more" button that reveals the rest of the content, which is hidden. What's Google's view on this? Is this bad for SEO?
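One way to keep the hidden paragraphs in the crawlable HTML while still collapsing them visually is the native `<details>` element; a minimal sketch (the copy is a placeholder):

```html
<p>First paragraph of the product information, always visible...</p>

<details>
  <summary>Read more</summary>
  <!-- These paragraphs stay in the DOM, so crawlers can still read them;
       they are only hidden visually until the user expands the section -->
  <p>Second paragraph of product details...</p>
  <p>Third paragraph, history, specifications, and so on...</p>
</details>
```

The key point with any read-more pattern is that the extra content is present in the delivered HTML and merely styled closed, not fetched only after the click.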
-
Content per page?
We used to have an article's worth of content in a scroll box created by our previous SEO; the problem was that it was very keyword-stuffed, link-stuffed, and complete crap. We then removed this and added more content above the fold. The problem I have is that we are only able to add 150-250 words above the fold, and a bit of that is repeated across the pages. Would we benefit from putting an article at the bottom of each of our product pages? And when I say article, I mean high-quality, in-depth content that goes into a lot more detail about the product, its history, and more. Would this help our SEO (give the page more uniqueness and authority rather than 200-250-word pages)? The one problem I can see: would an article's worth of content be OK at the bottom of the page, and in a div tab or scroll box at that?
-
Blog content - what to do, and what to avoid in terms of links, when you're paying for blog content
Hi, I've just been looking at a restaurant site that is paying food writers to put food news and blogs on their website. I checked the backlink profile of the site, and the various bloggers in question usually link from their blogs / company websites to the said restaurant to help promote any new blogs that appear on the restaurant site. That got me wondering whether this might cause problems with Google. I guess they've been putting about one blog live per month for 2 years, from 12-13 bloggers who have been linking to their website. What would you advise?
-
Virtual Domains and Duplicate Content
So I work for an organization that uses virtual domains. Basically, we have all our sites on one domain, and these sites can also be shown at a different URL. Example: sub.agencysite.com/store and sub.brandsite.com/store. The problem often comes up when we move a site to a brand's URL versus hosting it on our URL: we end up with duplicate content. Now, for god knows what reason, I currently cannot get my dev team to implement 301s, but they will implement 302s. (Don't ask.) I am also unable to change the robots.txt file for our site. They say that if we allowed people to go in and change this stuff it would be too messy, and somebody would accidentally block a site that was not supposed to be blocked on our domain. (We are apparently incapable toddlers.) Now I have an old site, sub.agencysite.com/store, ranking for my terms while the new site is not showing up. So I am left with this question: if I want to get the new site ranking, what is the best methodology? I am thinking of doing a 1:1 mapping of all pages, setting up 302 redirects from the old to the new, and then making the canonical tags on the old reflect the new. My only question is how Google will actually view this setup. I mean, on one hand I am saying "Hey, Googs, this is just a temp thing," and on the other I am saying "Hey, Googs, give all the weight to this page, got it? Graci!" So with my limited abilities, can anybody provide me a best-case scenario?
-
Duplicate page Content
There have been over 300 pages on our client's site with duplicate page content. Before we embark on a programming solution to this with canonical tags, our developers are requesting the list of originating sites/links/sources for these odd URLs. How can we find a list of the originating URLs? If you can provide a list of originating sources, that would be helpful. For example, the following pages are showing (as a sample) as duplicate content: www.crittenton.com/Video/View.aspx?id=87&VideoID=11 www.crittenton.com/Video/View.aspx?id=87&VideoID=12 www.crittenton.com/Video/View.aspx?id=87&VideoID=15 www.crittenton.com/Video/View.aspx?id=87&VideoID=2 "How did you get all those duplicate URLs? I have tried to Google the 'contact us', 'news', and 'video' pages. I didn't get all those duplicate pages. The id=87 on most of the duplicate pages is not supposed to be there. I was wondering how visitors got to all those duplicate pages. Please advise." Note: the CMS does not create this type of hybrid URL. We are as curious as you as to where/why/how these are being created. Thanks.