Is this a spammy/Panda problem?
-
We have a property site with many area pages that list properties. For example, one page may have up to 30 properties listed, each with around 100 words of description on that listing page; you then click 'more info' to get to that property's own page, which has maybe 200 words in total.
I want to add bullet points for each property on the area page, but I'm worried that it may be seen by Google as spammy even though it's useful to the client.
For example, if I had 30 properties on a page, and 28 of them said, next to each picture:
- Property Type: Shared, Open Plan, Single Bed
Would that be a problem for Google?
-
Nope - I would always defer to what is better for the user. Remember that whilst there are many components of the algorithm that analyse the page, there are also parts that look at engagement. If the changes have a positive impact on engagement and UX, as you suspect, then I would not fear an algorithmic penalty.
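If the bullets are genuinely useful detail, plain, crawlable markup is all you need - a rough sketch of one listing card (the class names, URLs and property details here are made up for illustration, not taken from your site):

```html
<!-- One listing card on the area page (all names and values are hypothetical) -->
<article class="property-summary">
  <img src="/images/property-123.jpg" alt="Open-plan shared flat">
  <h3><a href="/properties/123">Open-plan shared flat</a></h3>
  <p>Around 100 words of unique description for this property...</p>
  <ul class="property-features">
    <li>Property type: Shared</li>
    <li>Open plan</li>
    <li>Single bed</li>
  </ul>
  <a href="/properties/123">More info</a>
</article>
```

As long as the list reflects real attributes of each property rather than a repeated keyword string, it reads as useful content to both visitors and crawlers.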
Always, always test. Roll it out, decide on your metrics, and judge the results by those measurements of what success looks like. If it has a negative impact on rankings or engagement, then reconsider - you can always roll back.
It's very easy to get into analysis paralysis when worrying about the search ranking algorithm - do what is right by your users first and you won't go far wrong.
Hope that helps
Marcus
Related Questions
-
Magento 2.1 Multi Store / SEO
This is quite technical, but I'm hoping a Magento expert can clear this up for me. Currently my company has two websites on separate Opencart platforms. What I'm doing now is building a Magento website and using the multi-store function, as well as a few modules, to combine the two sites. The aim is that the link juice is shared and I can focus my SEO efforts on one site instead of two, reducing my workload while maintaining the benefits. This is the intended layout: www.domain.com and www.domain.com/us

I have created a sub-folder (not a subdomain), as this seems to be the best way to share link juice between the new, combined sites (as well as 301s from the old, redundant site). At the moment I have created two separate websites, stores and store views (see attached) and have configured them according to the Magento guide, so I know that technically this is correct, but I need to make sure that I have done it correctly in relation to SEO. Is the sub-folder set up correctly, for instance? Currently the only files in that sub-folder are a .htaccess, an error log and an index.php (see attached). Also, is there anything I could be missing in relation to SEO within the parameters of what I am trying to achieve?

Additionally, only one store view appears in the "change store view" section of the home page. This is causing me to question whether I have set it up correctly, because I had assumed both store views would appear even if they were under different websites (attached). Or do I simply use the same website and create two stores and store views? Do I also need to create a separate database for each website/store/store view?

I would very much appreciate it if someone could help out here. Thank you.
Web Design | | moon-boots0 -
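For context on the sub-folder question above: a Magento 2 website served from a sub-folder is normally driven by that folder's index.php setting a run code before bootstrapping. A minimal sketch, assuming a website code of 'us_website' (the code and paths are placeholders, not taken from the question):

```php
<?php
// /us/index.php - hypothetical entry point for the sub-folder website.
// 'us_website' is an assumed code; use the one defined under
// Stores > All Stores in the Magento admin.
use Magento\Framework\App\Bootstrap;

require __DIR__ . '/../app/bootstrap.php';

$params = $_SERVER;
$params[\Magento\Store\Model\StoreManager::PARAM_RUN_CODE] = 'us_website';
$params[\Magento\Store\Model\StoreManager::PARAM_RUN_TYPE] = 'website';

$bootstrap = Bootstrap::create(BP, $params);
$app = $bootstrap->createApplication(\Magento\Framework\App\Http::class);
$bootstrap->run($app);
```

All websites, stores and store views created this way share the one Magento installation and database.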
Incorporating Spanish Page/Site
We bought an exact-match domain (in Spanish) to incorporate with our regular website for a particular keyword. This is our first attempt at this, and while we do have Spanish-speaking staff who will translate/create a nice, quality page, we're not going to redo everything in Spanish. Any advice on how to implement this? Do I need to create a whole other website in Spanish? Will that be duplicate content if I do? Can I just set it up to show the first page in Spanish, but if they click on anything else it redirects to our site? I'm pretty clueless on this, so if anything I've suggested is off-the-wall or a violation, I'm really just spit-balling, trying to figure out how to implement this. Thanks, Ruben
Web Design | | KempRugeLawGroup0 -
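One common way to connect an English page with its Spanish translation - and to signal to search engines that they are language variants rather than duplicates - is hreflang annotations in the head of both pages. A hypothetical sketch (all URLs are placeholders):

```html
<!-- In the <head> of the English page (placeholder URLs) -->
<link rel="alternate" hreflang="en" href="https://www.example.com/service/" />
<link rel="alternate" hreflang="es" href="https://www.ejemplo.com/servicio/" />

<!-- The Spanish page carries the same pair of annotations pointing back -->
<link rel="alternate" hreflang="en" href="https://www.example.com/service/" />
<link rel="alternate" hreflang="es" href="https://www.ejemplo.com/servicio/" />
```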
Https Implementation - Weird Redirection After Putting 's' in http://
Hi Mozers, I have come across some websites whose https version goes to a totally different website. For example, http://www.samplesite1.com will load fine, but when the protocol is changed to https (https://www.samplesite1.com) it goes to a totally different domain, say https://www.samplesite2.com. How does this happen, in a technical sense? The warning from the browser says the security certificate is from the other website, but I would like to understand how this happens and how it impacts SEO. I can't seem to work out the relationship between this error and its SEO impact. Thanks in advance for your response. Malika
Web Design | | Malika10 -
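This usually comes down to hosting rather than SEO: when a site has no HTTPS virtual host (and no certificate) of its own, a request to port 443 on that shared IP is answered by the server's default SSL virtual host, which belongs to another site - hence the other domain's certificate and content. A hypothetical Apache sketch of that situation (all names and paths are made up):

```apache
# samplesite2 is the only site on this IP with an SSL vhost, so it becomes
# the default answer for every https request to the IP - including
# https://www.samplesite1.com - and its certificate triggers the warning.
<VirtualHost *:443>
    ServerName www.samplesite2.com
    SSLEngine on
    SSLCertificateFile    /etc/ssl/certs/samplesite2.crt
    SSLCertificateKeyFile /etc/ssl/private/samplesite2.key
    DocumentRoot /var/www/samplesite2
</VirtualHost>

# samplesite1 only has a plain-HTTP vhost; there is no matching *:443 entry.
<VirtualHost *:80>
    ServerName www.samplesite1.com
    DocumentRoot /var/www/samplesite1
</VirtualHost>
```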
Responsive Site has "Not Found" Errors for mobile/ and m/ in Google Search Console
We have recently launched a new responsive website for a client and have noticed two "Not Found" errors within Google Search Console, for /mobile and /m. Neither of these URLs is linked from anywhere within the site, yet Google is reporting them as being linked from the homepage. This is not the first site on which we have seen Google report this error; however, the other site was not mobile-friendly. My thought is to 301 them back to the homepage. Does anybody else have any thoughts on this, or has anyone recently received the same errors?
Web Design | | JustinTaylor881 -
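If you do go the 301 route, a couple of .htaccess rules along these lines would do it. A minimal sketch, assuming Apache with mod_alias and a placeholder domain:

```apache
# Hypothetical .htaccess rules: permanently redirect the phantom
# /mobile and /m paths back to the homepage.
RedirectMatch 301 ^/mobile/?$ https://www.example.com/
RedirectMatch 301 ^/m/?$ https://www.example.com/
```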
Best way to indicate multiple Lang/Locales for a site in the sitemap
So here is a question that may be obvious, but I'm wondering if there is some nuance here that I may be missing. Question: consider an ecommerce site that has multiple sites around the world which are all variations of the same thing, just in different languages. Now let's say some of these exist on just a normal .com page while others exist on different ccTLDs. When you build out the XML sitemap for these sites, especially the ones on the other ccTLDs, we want to ensure that using

```xml
<loc>http://www.example.co.uk/en_GB/</loc>
<xhtml:link
    rel="alternate"
    hreflang="en-AU"
    href="http://www.example.com.AU/en_AU/"
/>
<xhtml:link
    rel="alternate"
    hreflang="en-NZ"
    href="http://www.example.co.NZ/en_NZ/"
/>
```

would be the correct way of doing this. I know I have to change this for each different ccTLD, but it just looks weird when you start putting 10-15 different language/locale variations as alternate links. I guess I am just looking for a bit of reaffirmation that I am doing this right. Thanks!
Web Design | | DRSearchEngOpt0 -
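For reference, a complete sitemap entry of that kind, with the namespaces declared and a self-referencing annotation included, would look roughly like this (the domains are the same placeholders used in the question):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.example.co.uk/en_GB/</loc>
    <xhtml:link rel="alternate" hreflang="en-GB" href="http://www.example.co.uk/en_GB/"/>
    <xhtml:link rel="alternate" hreflang="en-AU" href="http://www.example.com.au/en_AU/"/>
    <xhtml:link rel="alternate" hreflang="en-NZ" href="http://www.example.co.nz/en_NZ/"/>
  </url>
</urlset>
```

The same full set of alternates is then repeated in each ccTLD's own sitemap, with that site's URL in the loc element.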
Is it common to have some errors/warnings (currency duplicates, redirects, etc.) on websites that rank well?
Hi, could anybody give me some ideas on on-page optimisation? Currently in my campaign I have around 3,000+ errors, 14,000+ warnings and 7,000+ notices for the following reasons:

Overly-Dynamic URL
Temporary Redirect
Title Element Too Long (> 70 Characters)
Duplicate Page Title
etc.

First of all, I know these have a negative effect on SEO. However, fixing those issues involves a lot of work and time. At the same time, most of our important keyword/URL rankings have not changed over the last 12 months. Does that mean the above has only a limited negative effect? I just want to know whether it is worth investing the man-hours/money to clean up those issues, as it involves decent development time. Is it common to have some of these errors/warnings on websites that rank well? (e.g. I've seen many big websites with duplicate titles/meta descriptions on their currency-variant pages)
Web Design | | LauraHT0 -
Multiple Sites, multiple locations similar / duplicate content
I am working with a business that wants to rank in local searches around the country for the same service, so they have websites such as OURSITE-chicago.com and OURSITE-seattle.com. All of these sites sell the same services, with small variations in each state due to different legal standards. The current strategy is to put up similar "local" websites with all the same content, so the bottom line is that we have a few different sites with the same content. The business wants to go national and is planning a different website for each location. In my opinion the duplicate content is a real problem. Unfortunately the nature of the service means there aren't many ways to say the same thing on each site 50 times without duplicate content, and rewriting content for each state seems like a daunting task when you have 70+ pages per site. So, from an SEO standpoint we have considered:

- Using the canonicalization tag on all but the central site - I think this would hurt all of the websites' SERPs because none will have unique content.
- Having a central site with directories, OURSITE.com/chicago - but this creates a problem because we need to link back to the relevant content on the main site and ALSO have the unique "Chicago" content easily accessible to Chicago users while Seattle users can access their Seattle data. The best way we thought of to do this was a frame with a universal menu and a unique state-based menu - also not a good option, because frames will also hurt SEO.
- Rewrite all the same content 50 times.

You can see why none of these are desirable options. But I know that plenty of websites have "state maps" on their main site. Is there a way to accomplish this that doesn't make our copywriter want to kill us?
Web Design | | SysAdmin190 -
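For what it's worth, the canonical option described above is just a link element in the head of each city site's page pointing at its equivalent on the central site - a hypothetical sketch (the domains and paths are placeholders):

```html
<!-- In the <head> of the page on OURSITE-chicago.com (placeholder URLs) -->
<link rel="canonical" href="https://www.oursite.com/service/" />
```

Note that a canonical tag asks search engines to consolidate ranking signals on the target URL, so the city pages carrying it would generally drop out of the index - which is exactly the trade-off the question is weighing.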
Should /dev folder be blocked?
I have been experiencing a ranking drop every two months, so I came up with a new theory this morning... Does Google do a deep crawl of your site, say every 60-90 days, and would it penalize a site for duplicate content if it crawled into your /dev area, which contains pretty much the exact same URLs and content as your production environment? The only issue I see with this theory is that I have been penalized only for specific keywords on specific pages, not necessarily across the board. Thoughts? What would be the best way to block out your /dev area?
Web Design | | BoulderJoe0
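On the last part of that question - blocking out a /dev area - a robots.txt rule is the usual first step, though password-protecting the environment is more reliable since robots.txt is only advisory. A minimal sketch, assuming the dev copy lives under /dev/ on the same host:

```
# robots.txt at the site root: ask all crawlers to stay out of /dev/
User-agent: *
Disallow: /dev/
```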