Is this a spammy/Panda problem?
-
We have a property site with many area pages listing properties. For example, one page may list up to 30 properties, each with around 100 words of description on the listing page; clicking 'more info' takes you to that property's own page, which has maybe 200 words in total.
I want to add bullet points for each property on the area page, but I'm worried that Google may see it as spammy even though it's useful to the client.
For example, if I had 30 properties on a page and 28 of them said, next to each picture:
- Property Type: Shared, Open Plan, Single Bed
Would that be a problem for Google?
-
Nope - I would always defer to what is better for the user. Remember that whilst there are many components of the algorithm that analyse the page, there are also parts that look at engagement - if the changes have the positive impact on engagement and UX that you suspect, then I would not fear an algorithmic penalty.
Always, always test. Roll it out. Decide on your metrics and judge the results against those measures of what success looks like. If the change has a negative impact on rankings or engagement, reconsider - you can always roll back.
It's very easy to get into analysis paralysis when worrying about the search ranking algorithm - do what is right by your users first and you won't go far wrong.
Hope that helps
Marcus
Related Questions
-
Items 30 - 50", however this is not accurate. Articles/Pages/Products counts are not close to this, products are 100+, so are the articles. We would want to either hide this or correct this.
We are running into this issue where we see items 30 -50 appear underneath the article title for google SERP descriptions . See screenshot or you can preview how its appearing in the listing for the site here: https://www.google.com/search?source=hp&ei=5I5fX939L6qxytMPh_el4AQ&q=site%3Adarbyscott.com&oq=site%3Adarbyscott.com&gs_lcp=CgZwc3ktYWIQAzoICAAQsQMQgwE6BQgAELEDOgIIADoECAAQCjoHCAAQsQMQClDYAljGJmC9J2gGcAB4AIABgwOIAYwWkgEIMjAuMy4wLjKYAQCgAQGqAQdnd3Mtd2l6sAEA&sclient=psy-ab&ved=0ahUKEwjd_4nR_ejrAhWqmHIEHYd7CUwQ4dUDCAk&uact=5 Items 30 - 50", however this is not accurate and we are not sure what google algorithm is counting. . Articles/Pages/Products counts are not close to this, products are 100+, so are the articles. Anyone have any thoughts on what google is pulling for the count and how to correct this? We would want to either hide this or correct this. view?usp=sharing
Web Design | Raymond-Support
Is Prerender.io/React going to negatively impact our SEO efforts?
On any page on the site (https://theadventurepeople.com/), the same short source code appears. Having investigated Google's indexed pages, Google's cache, and Fetch & Render, it does look like Google can view and index the content, but we're not 100% convinced. Background technical information from the web developer: the website is a single-page application built using React, and the site is set up with Prerender (https://prerender.io/), which renders the JavaScript in a browser, saves the static HTML, and returns that to crawlers (a minimal illustrative sketch of that pattern appears below). Is Prerender.io/React going to negatively impact our SEO efforts?
Web Design | Wagada
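For context on the Prerender question above, here is a minimal sketch of the general pattern such services rely on: detect known crawler user agents and serve them previously rendered static HTML, while normal visitors get the usual React bundle. This is a hand-rolled, illustrative Express middleware, not prerender.io's official middleware; the user-agent list, cache, and routes are assumptions for the example.

```typescript
import express, { Request, Response, NextFunction } from "express";

// User agents treated as crawlers in this sketch (illustrative, not exhaustive).
const CRAWLER_UA = [/googlebot/i, /bingbot/i, /baiduspider/i, /facebookexternalhit/i];

// Hypothetical store of previously rendered HTML, keyed by path. A real prerender
// service renders the JavaScript in a headless browser and caches the result.
const prerenderedCache = new Map<string, string>();

function isCrawler(userAgent: string | undefined): boolean {
  return !!userAgent && CRAWLER_UA.some((pattern) => pattern.test(userAgent));
}

function serveStaticToCrawlers(req: Request, res: Response, next: NextFunction): void {
  if (isCrawler(req.headers["user-agent"])) {
    const html = prerenderedCache.get(req.path);
    if (html) {
      // Crawlers receive fully rendered HTML, so content is visible without executing JS.
      res.status(200).type("html").send(html);
      return;
    }
  }
  // Regular visitors (and cache misses) fall through to the normal SPA shell.
  next();
}

const app = express();
app.use(serveStaticToCrawlers);
app.get("*", (_req, res) => {
  res.type("html").send(`<div id="root"></div><script src="/bundle.js"></script>`);
});
app.listen(3000);
```

If the rendered HTML shown in Google's cache and Fetch & Render matches what users see, a setup like this is doing its job; the usual risks are stale cached snapshots and the crawler list drifting out of date.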
Disallow: /sr/ and Disallow: /si/ - robots.txt
Hello Mozzers - I have come across the two directives above in the robots.txt file of a website. The web dev who implemented the robots.txt isn't sure what they mean - I think it's just legacy stuff that nobody has analysed for years. I vaguely recall that sr means 'search request', but I can't remember. If any of you know what these directives do, please let me know (a small sketch of how such rules match URLs appears below).
Web Design | McTaggart
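A general note on the robots.txt question above: a Disallow rule blocks crawling of any URL whose path begins with the given prefix, so Disallow: /sr/ and Disallow: /si/ simply keep compliant crawlers out of those two directories - what those directories actually contain is specific to the site. A simplified sketch of that prefix matching (ignoring User-agent groups, Allow rules, wildcards and end anchors):

```typescript
// Simplified robots.txt check: a path is blocked if it starts with any Disallow prefix.
// Real parsers also handle User-agent groups, Allow rules, wildcards (*) and anchors ($).
const disallowPrefixes = ["/sr/", "/si/"];

function isBlocked(path: string): boolean {
  return disallowPrefixes.some((prefix) => path.startsWith(prefix));
}

// Matching is on the literal prefix, not on "similar" paths.
console.log(isBlocked("/sr/listings/123")); // true
console.log(isBlocked("/si/page.html"));    // true
console.log(isBlocked("/search/results"));  // false
```

If nothing on the site lives under /sr/ or /si/ any more, the lines are harmless legacy and can be removed.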
Help with error: Not Found The requested URL /java/backlinker.php was not found on this server.
Hi all, we have had this error for almost a month now. Until recently we were outsourcing the web design and optimization; now we are doing it in house, and the previous company did not give us all the information we should know. We have been trying to find and fix this error with no result. Have you encountered this issue before? Does anyone know of a solution? Also, would this affect our website in terms of SEO and in general? We would be very grateful to hear from you. Many thanks. Here is what appears at the bottom of the site (www.manvanlondon.co.uk): Not Found - The requested URL /java/backlinker.php was not found on this server. Apache/2.4.7 (Ubuntu) Server at 01adserver.com Port 80
Web Design | monicapopa
Fixing my site's problem with duplicate page content
My site has a problem with duplicate page content - Moz is telling me 725 pages' worth. I have looked into the 301 redirect and the rel=canonical tag and I have a few questions. First of all, I'm not sure which one I should use in this case; I have read that the 301 redirect is the more popular path to take. If I take this path, do I need to go in and change the URL of each of these pages, or does it change automatically within the redirect when I plug in the old URL and the new one? Also, do I just go to each page that Moz flags as a duplicate and create a redirect for that page? One thing I am very confused about is that some of these listed duplicates are actually different pages on my site - does that just mean the URLs are too similar to each other and therefore need the redirect to fix them? Then, on the other hand, I have a login page that is said to have 50 duplicates; would that be a case where I would use the canonical tag, placing it on each duplicate so that the search engine knows to go to the original page? Sorry for all of the questions. Thank you for any responses. (A small sketch of both approaches appears below.)
Web Design | JoshMaxAmps
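For the duplicate-content question above, here is a minimal sketch of the two mechanisms being compared: a server-side 301 redirect (the duplicate URL stops serving its own page and permanently points at the preferred URL) versus a rel=canonical tag (both URLs keep resolving, but the duplicate declares which URL is the preferred version). The Express server and URLs are assumptions for illustration, not the asker's actual setup:

```typescript
import express from "express";

const app = express();

// Option 1: 301 (permanent) redirect - requests for the duplicate URL are sent to the
// preferred URL, so you don't edit the duplicate page's content; every link or bookmark
// to it now lands on the canonical page instead.
app.get("/properties/old-duplicate-page", (_req, res) => {
  res.redirect(301, "/properties/preferred-page");
});

// Option 2: rel=canonical - the near-duplicate page still renders (useful for pages that
// must stay reachable, such as a login page that appears under several URLs), but its
// <head> points search engines at the preferred version.
app.get("/login", (_req, res) => {
  res.type("html").send(`<!doctype html>
<html>
  <head>
    <link rel="canonical" href="https://www.example.com/login" />
    <title>Log in</title>
  </head>
  <body>Login form here</body>
</html>`);
});

app.listen(3000);
```

In practice the 301 suits URLs that have genuinely been merged or retired, while the canonical tag suits pages that must remain accessible under several URLs.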
RSS Submissions Positive/Negative/Neutral Impact...
We are looking to push our site content and blog content out to the masses... There are several sites and services that accept RSS feeds or submit RSS feeds to 50+ RSS sites. Have you seen a positive or negative impact from submitting feeds to these RSS directories? I am primarily looking at this as a way of getting our content out and building inbound links... Any thoughts or feedback would be appreciated... C
Web Design | hireawizseo
Do on-page links have an effect on SERP rankings with Panda?
I have been doing some competitive analysis, basing my company on others, and have noticed a pattern: very high-ranking sites seem to have limited the internal and external on-page links on their subdomains to under 100. My site has a lot of links, but all are relevant and lead to unique content. I am interested to know if anyone else has noticed this pattern in the SERP results. Is Google now penalizing pages with too many on-site nav links? And is a full site restructure needed to allow Google to index and rank these pages, or is it a non-issue that does not need to be addressed? Panda confuses me! HELP!
Web Design | Brother22
Should /dev folder be blocked?
I have been experiencing a ranking drop every two months, so I came up with a new theory this morning... Does Google do a deep crawl of your site, say, every 60-90 days, and would it penalize a site if it crawled into your /dev area - which contains pretty much the exact same URLs and content as your production environment - and flag you for duplicate content? The only issue I see with this theory is that I have been penalized only for specific keywords on specific pages, not necessarily across the board. Thoughts? What would be the best way to block out the /dev area? (A sketch of one common approach appears below.)
Web Design | BoulderJoe
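On the /dev question above, a common approach is to keep crawlers (and casual visitors) out of the staging area at the server level rather than relying on robots.txt alone, since a Disallow line stops crawling but does not remove URLs Google already knows about. A minimal sketch, assuming the /dev area is served from the same Node/Express app - the credentials, paths, and header choices are illustrative only:

```typescript
import express, { Request, Response, NextFunction } from "express";

const app = express();

// Gate everything under /dev: send a noindex signal and require basic auth,
// so the staging copy never competes with production as duplicate content.
function protectDevArea(req: Request, res: Response, next: NextFunction): void {
  if (!req.path.startsWith("/dev")) {
    next();
    return;
  }

  // Even if a /dev URL is fetched, tell crawlers not to index it.
  res.setHeader("X-Robots-Tag", "noindex, nofollow");

  const auth = req.headers.authorization ?? "";
  const expected = "Basic " + Buffer.from("devuser:devpassword").toString("base64"); // placeholder credentials
  if (auth !== expected) {
    res.setHeader("WWW-Authenticate", 'Basic realm="dev"');
    res.status(401).send("Authentication required");
    return;
  }
  next();
}

app.use(protectDevArea);
app.get("/dev/index.html", (_req, res) => res.send("Staging copy"));
app.listen(3000);
```

A Disallow: /dev/ line in robots.txt can sit alongside this, though note that blocking and noindex interact: a URL that cannot be crawled will never have its noindex header seen, which is why the server-level gate does the real work here.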