E-commerce store, in need of protecting our own content
-
Dear other Moz fans,
We have an e-commerce store in Norway. Our main conversion to sale still happens in our physical store, but it is driven by the descriptions and information we provide online.
To warn you before you click: our store is a boutique for "erotic items". A nice one, however, made by women for women and their men. We spend an enormous amount of time writing descriptions and information for (almost) every item online.
We really want to protect our content (the text information). What is the best practice for marking up "protection" of the content we have worked so hard on?
Thank you for your time.
Regards from the Flirt girls in Norway. -
Thank you Tuzzel,
I will take a closer look at the article; there might be some ideas there. We have looked at the authorship options, but as you say, it's not what I'm looking for.
Thank you -
Thank you for your fast reply, Remus,
But it's not what I'm looking for, I'm afraid. Still, a wrongly pointing URL was discovered, so thank you. We have been looking into rel="author" and rel="publisher", but as far as we can see these are more blog-related mark-ups. Our Google+ page doesn't cover this either, since it is a page and not a profile.
I might be making this much more complicated than it is... But it is worth a shot.
Monica
-
You have several options. While you can never stop someone coming to your site and actively taking your content, you can attempt to trip them up, particularly if they are using automated tools like scrapers. There are a few articles out there (like this) that go into detail, but common recommendations include adding links in your text and images that point to other pages on your site; the sites stealing the content will then often inadvertently include links back to you in their pages. To avoid issues with low-quality links from these sources, you should probably make these links nofollow to be safe. Then there is authorship etc., although that's not quite right for product descriptions, though you could investigate its feasibility.
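The "links in your text" tactic above can be sketched as a small publishing step: append an absolute, nofollow attribution link to each description, so a scraper that copies the HTML verbatim carries a link back. This is a minimal sketch; the helper name, domain, and wording are illustrative, not from the thread.

```python
# Sketch: append an absolute, nofollow attribution link to a product
# description before publishing, so scrapers that copy the HTML verbatim
# inadvertently link back. Domain and wording are hypothetical examples.

def with_attribution(description_html: str, product_url: str) -> str:
    """Return the description with a self-referencing nofollow link appended."""
    attribution = (
        f'<p class="source">Originally published at '
        f'<a href="{product_url}" rel="nofollow">{product_url}</a></p>'
    )
    return description_html + attribution

html = with_attribution(
    "<p>Hand-made silk item, designed in Norway.</p>",
    "https://www.example.no/products/item-42",
)
print(html)
```

The link must be absolute (full domain), otherwise it resolves against the thief's own site once copied.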
Other than that, there is enforcing your copyright, but to do so you need to locate the stolen content. Again, there are multiple tools out there, such as the Copyscape that Remus mentioned, but a quick and easy option is to set up Google Alerts to look for that content. Then you can contact the webmasters and, if necessary, use DMCA takedown requests.
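Alerts work best on short, distinctive phrases. As a rough sketch (the longest-sentence heuristic is my own illustration, not from the thread), you can pull the most distinctive sentence from each description and build an exact-match, quoted search query for it:

```python
import re
import urllib.parse

# Sketch of the Google Alerts idea: pick the longest (usually most
# distinctive) sentence from a description and build an exact-match,
# quoted search URL for it. The heuristic is illustrative only.

def exact_match_query(description: str) -> str:
    """Return a Google search URL for the description's longest sentence, quoted."""
    sentences = [s.strip() for s in re.split(r"[.!?]", description) if s.strip()]
    phrase = max(sentences, key=len)
    return "https://www.google.com/search?q=" + urllib.parse.quote(f'"{phrase}"')

print(exact_match_query(
    "Soft silk blend. Designed in Oslo by our own team for maximum comfort. Ships fast."
))
```

The same quoted phrase can be pasted into Google Alerts to get ongoing notifications instead of one-off searches.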
But if you are looking for methods to physically stop people taking your content, I'm not aware of a foolproof one, I'm afraid.
Hope this is helpful.
-
Hello,
Maybe Copyscape? They even have a tool called Copysentry which monitors the web regularly for plagiarism.
Related Questions
-
Need help with best practices on eliminating old thin content blogs.
We have about 100 really old blog posts that are nothing more than a short trip review with images. Consequently, these pages are poor quality. Would best practice be to combine them into one "review page" per trip, reducing from 100 to about 10 better pages, and implement redirects? Or is having more pages better, with fewer redirects? We only have about 700 pages total. Thanks for any input!
Intermediate & Advanced SEO | | KarenElaine0 -
Defining duplicate content
If you have the same sentences or paragraphs on multiple pages of your website, is this considered duplicate content and will it hurt SEO?
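One way to make "same sentences on multiple pages" concrete is shingle overlap, a standard way of comparing texts for duplication. This sketch (my own illustration, not an official Moz or Google algorithm) measures the share of word 3-grams two texts have in common:

```python
# Rough sketch of duplicate-content measurement: compare the word 3-grams
# ("shingles") two texts share. High overlap means substantially duplicate
# text; exact thresholds here are illustrative, not Google's.

def shingles(text: str, n: int = 3) -> set:
    """Return the set of word n-grams in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap(a: str, b: str) -> float:
    """Jaccard similarity (0.0-1.0) of the two texts' shingle sets."""
    sa, sb = shingles(a), shingles(b)
    if not (sa or sb):
        return 0.0
    return len(sa & sb) / len(sa | sb)

print(overlap("the quick brown fox jumps", "the quick brown fox sleeps"))
```

A shared sentence or two on otherwise distinct pages yields low overlap; whole copied paragraphs push it toward 1.0, which is the situation most likely to cause pages to compete or be filtered.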
Intermediate & Advanced SEO | | mnapier120 -
Duplicate content. Competing for rank.
Scenario: an automotive dealer lists cars for sale on their website. The descriptions are very good and in depth, at 1,200 words per car. However, chunks of the copy are taken from car review websites and woven into their original copy. Q1: This is flagged in Copyscape - how much of an issue is this for Google? Q2: The same stock with the same copy is fed into a popular car listing website, and the dealer's website and the classifieds website often rank in the top two positions (sometimes the dealer on top, other times the classifieds site). Is this a good or a bad thing? Are you risking being seen as duplicating/scraping content? Thank you.
Intermediate & Advanced SEO | | Bee1590 -
SEO: How to change page content + shift its original content to other page at the same time?
Hello, I want to replace the content of one page of our website (already indexed) and shift its original content to another page. How can I do this without problems like penalization?
Current situation:
Page A
URL: example.com/formula-1
Content: ContentPageA
Desired situation:
Page A
URL: example.com/formula-1
Content: NEW CONTENT!
Page B
URL: example.com/formula-1-news
Content: ContentPageA (the content that was on Page A!)
The content of the two pages will be about the same topic (and the same keyword) but non-duplicate. The new content on Page A is more optimized for search engines. How long will it take for the page to rank better?
Intermediate & Advanced SEO | | daimpa0 -
Lazy Loading of products on an E-Commerce Website - Options Needed
Hi Moz Fans. We are in the process of re-designing our product pages and we need to improve the page load speed. Our developers have suggested that we load the associated products on the page using lazy loading. While I understand this will certainly have a positive impact on the page load speed, I am concerned about the SEO impact. We can have upwards of 50 associated products on a page, so we need a solution. So far I have found the following solution online, which uses lazy loading and escaped fragments - the concern here is about serving an alternate version to search engines. The solution was developed by Google not only for lazy loading, but for indexing AJAX content in general.
Here's the official page: Making AJAX Applications Crawlable. The documentation is simple and clear, but in a few words the solution is to use slightly modified URL fragments.
A fragment is the last part of the URL, prefixed by #. Fragments are not propagated to the server; they are used only on the client side to tell the browser to show something, usually to move to an in-page bookmark.
If instead of using # as the prefix you use #!, this instructs Google to ask the server for a special version of your page using an ugly URL. When the server receives this ugly request, it's your responsibility to send back a static version of the page that renders an HTML snapshot (the not-indexed image in our case). It seems complicated, but it is not; let's use our gallery as an example. Every gallery thumbnail has to have a hyperlink like:
http://www.idea-r.it/...#!blogimage=<image-number>
When the crawler finds this markup it will change it to:
http://www.idea-r.it/...?_escaped_fragment_=blogimage=<image-number>
Let's take a look at what you have to answer on the server side to provide a valid HTML snapshot. My implementation uses ASP.NET, but any server technology will be good.

```csharp
var fragment = Request.QueryString["_escaped_fragment_"];
if (!String.IsNullOrEmpty(fragment))
{
    var escapedParams = fragment.Split(new[] { '=' });
    if (escapedParams.Length == 2)
    {
        var imageToDisplay = escapedParams[1];
        // Render the page with the gallery showing
        // the requested image (statically!)
        ...
    }
}
```

What's rendered is an HTML snapshot, that is, a static version of the gallery already positioned on the requested image (server side).
To make it perfect we have to give the user a chance to bookmark the current gallery image.
90% comes for free; we only have to parse the fragment on the client side and show the requested image:

```javascript
if (window.location.hash)
{
    // NOTE: remove the initial #
    var fragmentParams = window.location.hash.substring(1).split('=');
    var imageToDisplay = fragmentParams[1];
    // Render the page with the gallery showing the requested image (dynamically!)
    ...
}
```

The other option would be to look at a recommendation engine to show a small selection of related products instead. This would cut the total number of related products down. The concern with this one is that we are removing a massive chunk of content from the existing pages; some of it is not the most relevant, but it's content. Any advice and discussion welcome 🙂
Intermediate & Advanced SEO | | JBGlobalSEO0 -
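As a footnote to the escaped-fragment question above: the #!-to-_escaped_fragment_ rewrite it describes can be sketched in a few lines. This is a simplified illustration (it skips percent-encoding of the fragment), and note that Google has since deprecated this AJAX-crawling scheme in favour of rendering pages directly.

```python
# Simplified sketch of the URL rewrite the AJAX-crawling scheme describes:
# the crawler turns a #! fragment into an _escaped_fragment_ query parameter.
# (Google has since deprecated this scheme; shown here for illustration only.)

def escaped_fragment_url(url: str) -> str:
    """Rewrite a #! URL the way the scheme's crawler would request it."""
    if "#!" not in url:
        return url  # plain fragment or none: nothing for the crawler to rewrite
    base, fragment = url.split("#!", 1)
    separator = "&" if "?" in base else "?"
    return base + separator + "_escaped_fragment_=" + fragment

print(escaped_fragment_url("http://www.idea-r.it/gallery#!blogimage=5"))
```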
Magento E-Commerce Crawl Issues
Hi Guys, First post here! I am responsible for a Magento e-commerce store and there are a few crawl issues and potential solutions that I am working on, and I would like to get some advice to see if you agree with my approach.
Old product pages - the majority of our stock is seasonal, so when a product sells out it is usually not going to come back into stock. However, the approach for Magento websites is to leave the page present but take the product off the category pages, so users can still find these pages from the search engines. They are orphaned pages, as they are not linked to from elsewhere, and it is not totally clear the products are out of stock (the page just doesn't show the size pulldown or 'Add to Basket' button). There is no process in place to 301 redirect these pages either. My solution to this problem is to:
1. Change the design of these pages so a clear message is shown to users that the product is out of stock, and suggest related products to reduce bounce rates. I was also planning on having a link from an 'Out of Stock' page on the site to these products so they are no longer orphaned - but is this required, do you think?
2. When I know for sure (e.g. after over a month) that the product will not be returned (e.g. refunded) by the user, 301 redirect the product page back to the category page. How do other users 301 redirect their pages in Magento? I would like an easy-to-use system.
Crawl errors identified in Google Webmaster Tools - it seems in the last 2 weeks there has been a sharp increase in the number of soft 404 pages identified on the website. When I inspect these pages they seem to be categories and sub-categories that no longer have any products in them. However, I don't want to delete these pages as new products might come in and go onto these category pages, so how should I approach this? A suggestion I have thought of is to put related products onto these pages. Any better ideas? Thanks, Graeme
Intermediate & Advanced SEO | | graeme19940 -
Need Perfect URLs
I'm redesigning a site's structure from the ground up, and am having issues with the URLs. I'd love to have them be perfect, but I keep finding conflicting advice online.
1. For my services blog, is it best to have it set up like www.example.com/services/keyword or www.example.com/keyword? There seems to be conflicting advice: keep it short and keep the keyword as far to the left as possible, but also that including the word "services" would help with long-tail phrases and site organization.
2. For my blog section, is it best to have it set up like www.example.com/blog/keyword or www.example.com/keyword or www.example.com/blog-post-title-with-keyword-in-it? It's similar to the first question, but also adds the question of including the entire post title in the URL or just the keyword. Your help would be greatly appreciated!
Intermediate & Advanced SEO | | Stryde1 -
What to do when unique content is out of the question?
SEO companies/people are always stating that unique, quality content is one of the best things for SEO... But what happens when you can't do that? I've got a movie trailer blog and of late a lot of movie agencies are now asking us to use the text description they give us along with the movie trailer. This means that some pages are going to have NO unique content. What do you do in a situation like this?
Intermediate & Advanced SEO | | RichardTaylor0