Bespoke Website With Lack of Front Page Content
-
Hey guys,
I wanted to ask your opinion...
Say you had a portfolio-style website, for argument's sake, built on WordPress. If you want to keep the minimalist approach, the front page obviously won't be SEO friendly: there will be hardly any content to tell Google what to rank your site for...
So my question is: can you use a plugin so that Google can 'see' content, such as a long unique article, that the user can't see, in order to help you rank? I.e. for Googlebot, the plugin would load the content as plain HTML, but 'hide' it from most people visiting the site...
What would you do in this scenario?
Your response would be much appreciated!
Thanks in advance for your help!
-
I would not recommend putting anything on your site that Google can see but your customers can't. From https://support.google.com/webmasters/answer/66353?hl=en:
Hiding text or links in your content to manipulate Google’s search rankings can be seen as deceptive and is a violation of Google’s Webmaster Guidelines. Text (such as excessive keywords) can be hidden in several ways, including:
- Using white text on a white background
- Locating text behind an image
- Using CSS to position text off-screen
- Setting the font size to 0
- Hiding a link by only linking one small character—for example, a hyphen in the middle of a paragraph
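For illustration only, the invisible-text tricks that list describes look something like this in markup (hypothetical snippets; these are examples of what *not* to do):

```html
<!-- White text on a white background -->
<p style="color:#fff; background:#fff">keyword keyword keyword</p>

<!-- Using CSS to position text off-screen -->
<p style="position:absolute; left:-9999px">keyword keyword keyword</p>

<!-- Setting the font size to 0 -->
<p style="font-size:0">keyword keyword keyword</p>
```

All three leave the text readable to a crawler but invisible to a human visitor, which is exactly the mismatch the guidelines flag as deceptive.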
-
Hi there
You can certainly use design elements to keep your website attractive while still presenting useful information to both your users and the search bots.
You might have seen this already with landing pages that contain a video. You might want to make the video the main focus of the page, with little other content visible, to direct attention to the video and the subsequent call to action (CTA). What people have done in the past, with good success, is include the transcript in expandable content. There will be a bit of text saying "Click here for the transcript" and, upon clicking it, a box expands down and the content of the video is displayed. The search bot can read the content at all times, which will help the page to rank (provided it is on-topic), while the text is hidden by default for user experience reasons. There are lots of variations of this that you might have already seen.
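A minimal sketch of that expandable-transcript pattern (hypothetical markup; the native `<details>` element gives you expand/collapse behaviour without any JavaScript, and the transcript text stays in the HTML for bots to read):

```html
<!-- The video stays the visual focus of the page -->
<video src="/media/intro.mp4" controls></video>
<a href="#contact" class="cta-button">Get in touch</a>

<!-- Transcript is present in the HTML (crawlable) but collapsed by default -->
<details>
  <summary>Click here for the transcript</summary>
  <p>Hi, and welcome. In this video we walk through our portfolio
     and the services we offer...</p>
</details>
```

The same idea works with a CSS class toggled by a click handler; the key point is that the full text is in the page source for everyone, and the user can reveal it with one click.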
There are two things that I would always consider: 1) make sure you are hiding the content purely for design/UX reasons and not nefarious ones (such as hiding text on an irrelevant page/website), and 2) make sure the content is justifiably there on the page. I've seen some people "mask" their landing pages, in that they show the search bots a long, informative article, but a user who visits the page is either sent through a JavaScript redirect (which bots can't follow) or shown an iframe overlaying the page (which bots can't see), so the user sees one thing and the bots something completely different. In my opinion, this goes too far and is manipulative, which might see your page and/or site penalised. With this, you are deliberately misleading the bots and/or the users with what you want them to see. I would think a plugin that injects content only for Googlebot would likely be seen the same way.
There is a bit of a grey area, I suppose, but my earlier example allows both the users and the bots to see, if they so choose, the exact same page and layout. The difference is that you are styling some content so that the design and user experience are tailored, but you are not restricting either the bot or the user from seeing the full landing page should they wish to. That (again, in my opinion) is acceptable.
So with that in mind, I would look at incorporating design elements that let the content be displayed on demand but folded away by default. Expandable content, tabbed content, accordion menus: there are a number of ways to do this. Try not to be deliberately misleading, but definitely design your web page around what you want your user to see.
Hope this helps.