Potential issue: Page design might look like keyword stuffing to a web crawler
-
We have an interesting design element we might try on our home page. Here's a mockup: https://codepen.io/dsbudiac/pen/Bwrgjd
I'm worried web crawlers will interpret this as keyword stuffing and that it will hurt our rankings. It features:
- Mostly transparent/hidden text
- Repeating keyword list
I could try a few methods to skirt around crawling concerns:
- Load keywords through an iframe
- Make the keywords an image (would significantly increase page load)
- Inject keywords into a container with JavaScript after page load (probably not effective, as crawlers are only getting better at indexing JavaScript)
- Load the keywords into an svg element
- Load the keywords into a canvas element via javascript
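For what it's worth, here's a rough sketch of the post-load injection idea (the function name, selector, and keywords are all just placeholders) — build the repeating keyword string in JavaScript so it never appears in the initial HTML payload, then drop it into the container (or draw it into a canvas) after load:

```javascript
// Hypothetical sketch: build the repeating keyword banner as a plain string.
// Pure function, so the string never exists in the server-rendered HTML.
function buildKeywordBanner(keywords, repeats) {
  // e.g. ["pizza", "tacos"] repeated twice -> "pizza tacos pizza tacos"
  return Array.from({ length: repeats }, () => keywords.join(" ")).join(" ");
}

// In the browser, inject only after the load event fires:
// window.addEventListener("load", () => {
//   const banner = buildKeywordBanner(["pizza", "tacos", "burgers"], 40);
//   // Container variant:
//   document.querySelector("#keyword-banner").textContent = banner;
//   // Canvas variant:
//   const ctx = document.querySelector("#keyword-canvas").getContext("2d");
//   ctx.fillText(banner, 0, 20);
// });
```

The canvas variant is the most crawler-opaque of the two, since the text ends up as pixels rather than DOM nodes — but as noted above, the DOM-injection variant may still get picked up as crawlers render more JavaScript.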
I have a few questions:
- Should I be concerned about any potential keyword stuffing / SEO issues with this design?
- Can you comment on the effectiveness (with proof) of the above strategies?
- Am I better off just abandoning this type of design?
-
Ah, a very interesting question!
I'd not be too concerned; you're loading the content in through a data attribute rather than directly as text. However, there are definitely a few options you could consider:
- Render via SVG feels like the safest bet, though that's going to be a pretty large, complex set of vectors.
- Save + serve as an image (and overcome the file size concerns by using WebP, HTTP/2, a CDN like Cloudflare, etc)
- Serve the content via a dedicated JavaScript file, which you could block access to via robots.txt (a bit fudgey!)
I'd be keen to explore #2 - feels like you should be able to achieve the effect you're after with an image which isn't ridiculously huge.
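For option 3, the robots.txt rule would be a single Disallow line (the script path here is just a placeholder):

```
# Block crawlers from fetching the script that injects the keyword banner.
# Note: this blocks *crawling* of the file, not indexing of its effects.
User-agent: *
Disallow: /assets/keyword-banner.js
```

Worth flagging that Google has historically warned against blocking JavaScript resources it needs to render a page, which is part of why this option is a bit fudgey.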
-
Never said the image option was hard. It's just not ideal as it increases page load and is less flexible. A noindex'd iframe seems to be the best option. We already have a working proof of concept, thanks.
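In case it helps anyone else landing on this thread, the noindex'd iframe setup is roughly (paths are illustrative):

```html
<!-- Home page: embed the keyword banner from its own document -->
<iframe src="/keyword-banner.html" title="decorative keyword banner"></iframe>
```

The framed document itself then carries the noindex — either a `<meta name="robots" content="noindex">` in its head, or an `X-Robots-Tag: noindex` response header from the server — so the banner content is kept out of the index without touching the home page's own indexability.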
-
As long as you don't use that text inside a header, link, or some other prominent piece of content, you shouldn't have to worry about it. As I understand it, the h1 is the main signal Google uses to determine the primary keyword of a specific page.
-
I thought about using googleon/googleoff tags, but apparently those only work with the Google Search Appliance, not regular Google Search indexing: https://webmasters.stackexchange.com/questions/54735/can-you-use-googleon-and-googleoff-comments-to-prevent-googlebot-from-indexing-p
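For reference, the syntax looks like this — but again, per the linked answer it is only honored by the Google Search Appliance, so it won't keep the banner out of the public Google index:

```html
<!-- Google Search Appliance only; ignored by public Googlebot -->
<!--googleoff: index-->
<div class="keyword-banner">pizza tacos burgers pizza tacos burgers …</div>
<!--googleon: index-->
```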