Getting home page content to the top of what robots see
-
When I click on the text-only cache of nlpca(dot)com's home page (http://webcache.googleusercontent.com/search?q=cache:UIJER7OJFzYJ:www.nlpca.com/&hl=en&gl=us&strip=1), our H1 and body content are at the very bottom.
How do we get the H1 and content to the top of what the robots see?
Thanks!
-
For design purposes I've hardly ever used it, maybe a few times; it's really only for SEO that I've used it.
The code that you don't care about Google seeing first is what you put at the bottom of the HTML, and then you absolutely position it to the top so the website still looks the same.
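Roughly, the idea looks like this (a minimal sketch only; the IDs, text, and dimensions below are placeholders, not the site's actual markup):
HTML:
<div id="main-content">
  <h1>Your H1 goes here</h1>
  <p>The body copy you want crawlers to reach first...</p>
</div>
<div id="menu">
  <!-- the long navigation list, last in the source -->
</div>
CSS:
#main-content {margin-top:120px; width:900px;}
#menu {position:absolute; top:0; left:0; width:900px; height:100px;}
Crawlers reading the raw HTML hit the H1 and body copy first, while visitors still see the menu rendered across the top of the page because of the absolute positioning.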
-
Brent, I used to be a web designer. I was always told that absolute positioning was usually bad.
Also, which is seen first: the code that sits at the top of the source, or the content that's absolutely positioned to the top of the page?
-
Absolute positioning will allow you to create a div at the bottom of the page but display it at the top using CSS.
For example, to have div2 display above div1, it would look like this in the code:
HTML:
<div id="div1">Google sees first.</div>
<div id="div2">People see first.</div>
CSS:
#div1 {width:900px; height:500px;}
#div2 {position:absolute; top:-15px; width:900px; height:100px;}
I know it's not a very good example of code, but you should get the idea enough to start testing your layout. Also, go ahead and use this for testing: http://www.w3schools.com/cssref/tryit.asp?filename=trycss_position_absolute
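One thing worth knowing about that example: position:absolute takes div2 out of the normal document flow, so it renders wherever the top and left values place it, regardless of where it sits in the source. Those offsets are measured from the nearest positioned ancestor (or from the page itself if there isn't one), so in practice you would also leave room in div1, with padding or a margin, so the two blocks don't overlap.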
-
That's OK Brent. I hope you enjoyed Mexico.
Yes, please explain more.
-
Sorry for the delay, I was in Mexico on vacation.
Did you get your answer or would you like me to explain more?
-
Could you say more, Brent? I thought absolute positioning was something to stay away from, at least from a CSS perspective. I'm open to your idea, I just need to know more.
-
Use absolute positioning to arrange the content in the order you want search engines to see it.
-
Hi there,
It will be due to the way the HTML is written; you will have to alter the HTML to reach your desired outcome. I think the HTML is rendered in the order that it appears on the page, i.e. from top to bottom.
You have lots of menu options, and these are listed first in the HTML page.
I also recommend a general review of things, as your title is not appearing in the cached page either. Perhaps it's because you are specifying it here as well:
<meta name="title" content="NLP Training, Courses, Certification | NLP Institute of California" />
Hope this helps.
All the best!