XML Feed
-
If a site has an XML feed that is used by 100 companies to create the content on their sites, will those 100 sites receive any link juice?
Is there any way the content may be classed as duplicate across these sites? And should the page on the site the XML feed comes from be indexed first?
-
Hi Tina,
So you think all XML feeds produce duplicate content on the sites they go into?
The site the feed comes from is not necessarily the most important one for this exercise. The hundred-plus other sites that will use the feed on their pages are all customers, so I am working out the pros and cons of them displaying a large amount of data via the XML feed on their pages.
I could use the canonical tag; this would have to be set up on the original site that the feed comes from.
The 100+ sites using the feed want to know whether their pages can be indexed, with the XML feed on them, and be found in the search engines, or whether this is duplicate content without the canonical link.
For the purpose of this, the sites cannot create unique content.
-
I would say that they wouldn't be receiving any, and that the content would be seen as duplicate because it is the same as on the other sites.
If the site that the feed comes from is the site that is important to you, then you want that site to be indexed first.
I think you can use the canonical tag on the sites with the duplicate content to point to the original source.
Can the other sites not create unique content to make their pages more useful?
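For example, a customer site republishing the feed content could point back to the original page with a cross-domain canonical in the `<head>` of the duplicate page (both URLs here are hypothetical placeholders, not from the question):

```html
<!-- On the customer site's page that displays the feed content -->
<link rel="canonical" href="http://www.original-feed-site.com/original-article/" />
```

Search engines treat a cross-domain canonical as a hint, not a directive, so consolidation to the original source is not guaranteed.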
Related Questions
-
Sitemap.xml Site multilang
Hi all, I have some questions about a multilingual sitemap.xml. We use subdirectories on the same domain:
Technical SEO | | mobic
example.com/pt-br/
example.com/us/
example.com/es/
How should I do the sitemap.xml in this case? I thought of three alternatives.
Should I do a sitemap_index.xml for each language and make category sitemaps under each? Examples:
http://www.example.com/pt-br/sitemap_index.xml
http://www.example.com/en/sitemap_index.xml
http://www.example.com/es/sitemap_index.xml
Should I do only one sitemap_index.xml covering all categories of all languages? Examples:
http://www.example.com/sitemap_index.xml
http://www.example.com/pt-br/sitemap_categorias_1.xml
http://www.example.com/es/sitemap_categorias_1.xml
http://www.example.com/us/sitemap_categorias_1.xml
Or should I do one sitemap setting all the language alternates?
<url>
  <loc>http://www.example.com/us/</loc>
  <xhtml:link rel="alternate" hreflang="es" href="http://www.example.com/pt-br/" />
  <xhtml:link rel="alternate" hreflang="us" href="http://www.example.com/us/" />
  <xhtml:link rel="alternate" hreflang="pt-br" href="http://www.example.com/pt-br/" />
</xhtml:link></url>
Thanks for any advice.
-
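One note on the snippet in the question: hreflang takes an ISO 639-1 language code, optionally followed by a region, so "us" on its own is not a valid value, and the "es" alternate points at the /pt-br/ URL. A corrected entry might look like this (assuming /us/ is English and /es/ is Spanish; the urlset declaration is shown for completeness):

```xml
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.example.com/us/</loc>
    <xhtml:link rel="alternate" hreflang="en-US" href="http://www.example.com/us/" />
    <xhtml:link rel="alternate" hreflang="es" href="http://www.example.com/es/" />
    <xhtml:link rel="alternate" hreflang="pt-BR" href="http://www.example.com/pt-br/" />
  </url>
</urlset>
```

Each language version would repeat the same set of alternate links, including a self-referencing one.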
Xml sitemaps giving 404 errors
We have recently made updates to our XML sitemap and have split it into child sitemaps. Once these were submitted to Search Console, we received notification that all of the child sitemaps except 1 produced 404 errors. However, when we view the XML sitemaps in a browser, there are no errors. I have also attempted crawling the child sitemaps with Screaming Frog and received 404 responses there as well. My developer cannot figure out what is causing the errors, and I'm hoping someone here can assist. Here is one of the child sitemaps: http://www.sermonspice.com/sitemap-countdowns_paged_1.xml
Technical SEO | | ang0 -
Sitemap_index.xml = noindex,follow
I was running a report with the Screaming Frog SEO Spider and, under the Directives tab > Noindex, I saw that https://compleetverkleed.nl/sitemap_index.xml/ is served with an X-Robots-Tag of noindex,follow. Does this mean my sitemap isn't indexed? If anyone has some more tips for our website, feel free to give some suggestions 🙂 (The website is far from complete.)
Technical SEO | | Happy-SEO2 -
Is there a way for me to automatically download a website's sitemap.xml every month?
From now on we want to store all our sitemap.xml files over the next years. It's a nice archive to have that allows us to analyse how many pages we have on our website and which ones were removed or redirected. Any suggestions? Thanks
Technical SEO | | DeptAgency0 -
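One possible approach, as a minimal sketch (the sitemap URL and archive directory below are placeholders, not from the question): fetch the sitemap and save it under a date-stamped filename, then schedule the script monthly with cron or Task Scheduler.

```python
import urllib.request
from datetime import date
from pathlib import Path

def archive_sitemap(url: str, archive_dir: str) -> Path:
    """Download the sitemap at `url` and store it as sitemap-YYYY-MM.xml."""
    dest = Path(archive_dir)
    dest.mkdir(parents=True, exist_ok=True)   # create the archive folder if needed
    out_file = dest / f"sitemap-{date.today():%Y-%m}.xml"
    with urllib.request.urlopen(url) as resp:  # fetch the sitemap
        out_file.write_bytes(resp.read())      # save it under the dated name
    return out_file
```

You would call it with your own values, e.g. `archive_sitemap("https://www.example.com/sitemap.xml", "sitemap-archive")`, and a crontab entry such as `0 3 1 * *` would run it at 03:00 on the first of each month. Keeping one file per month also makes it easy to diff consecutive months to see which URLs were removed.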
Best Practices for adding Dynamic URLs to an XML Sitemap
Hi Guys, I'm working on an ecommerce website where all the product pages use dynamic URLs (we also have a few static pages, but there is no issue with them). The products on the site are updated every couple of hours (because we sell out or the special offer expires), and as a result I keep seeing heaps of 404 errors in Google Webmaster Tools, which I am trying to avoid (if possible). I have already created an XML sitemap for the static pages and am now looking at incorporating the dynamic product pages, but am not sure what the best approach is. The URL structure for the products is as follows: http://www.xyz.com/products/product1-is-really-cool
Technical SEO | | seekjobs
http://www.xyz.com/products/product2-is-even-cooler
http://www.xyz.com/products/product3-is-the-coolest Here are the 2 approaches I was considering: 1. Just include the products folder URL, http://www.xyz.com/products/, in the same sitemap as the static URLs - this way spiders have access to the folder the products are in and I don't have to create an automated sitemap for every product. OR 2. Create a separate automated sitemap that updates whenever a product is updated, and set the change frequency to hourly - this way spiders always have as close to an up-to-date sitemap as possible when they crawl it. I look forward to hearing your thoughts, opinions, suggestions and/or previous experiences with this. Thanks heaps, LW
-
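For option 2 above, each product entry in the automated sitemap could look like the sketch below (the product URL is taken from the question; the `<lastmod>` date and `<changefreq>` value are purely illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.xyz.com/products/product1-is-really-cool</loc>
    <lastmod>2014-01-01</lastmod>
    <changefreq>hourly</changefreq>
  </url>
</urlset>
```

Note that `<changefreq>` is only a hint to crawlers; it does not force re-crawling, and expired products should still return a proper 404/410 or redirect.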
Best practice for XML sitemap depth
We run an eCommerce site for education products with 20 or so subject-based catalogues (Maths, Literacy etc.), each catalogue having numerous ranges (Counting, Maths Games etc.), then products within those. We carry approximately 15,000 products. My question is about the sitemap we submit - nightly - and its depth. It is currently set to cover home, catalogues and ranges, plus all static content (about us etc.). Should we be submitting sitemaps that include product pages as well? Does it matter, or would it not make much difference in terms of search? Thanks in advance.
Technical SEO | | TTS_Group0 -
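If product pages are included, a sitemap index can split them into child files; 15,000 URLs is well under the protocol's 50,000-URL-per-file limit, so splitting would be for manageability rather than necessity. A sketch, with hypothetical filenames:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-static.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-categories.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-products.xml</loc>
  </sitemap>
</sitemapindex>
```

Only the index file needs to be submitted; search engines then discover the child sitemaps from it.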
SEO tips for RSS feeds?
What SEO advice do you have for RSS feeds? Specifically, does the URL structure matter? Should they be index, follow or noindex, follow? Any other advice?
Technical SEO | | nicole.healthline0 -
How to Submit XML Site Map with more than 300 Subdomains?
Hi,
Technical SEO | | vaibhav45
I am creating sitemaps for a site which has more than 500 subdomains. Pages vary from 20 to 500 across the subdomains, and more will keep being added in the coming months. I have seen sites that create a separate sitemap.xml for each subdomain, which they reference in a separate robots.txt file, e.g. http://windows7.iyogi.com/robots.txt with the subdomain's XML sitemap at http://windows7.iyogi.com/sitemap.xml.gz. Currently on my website we have only 1 robots.txt file for the main domain and subdomains. Please tell me: should I create a separate robots.txt and XML sitemap file for each subdomain, or just 1 file? Creating a separate XML sitemap for each subdomain is not feasible, as we would have to verify each one in GWT separately. Is there any automatic way to do this, and do I have to ping separately if I add new pages to a subdomain? Please advise me.
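For context on the per-subdomain pattern mentioned in the question: each subdomain serves its own robots.txt at its own root (a single file at the main domain does not apply to subdomains), and that file can declare the subdomain's sitemap. A minimal sketch, with a hypothetical subdomain:

```
# Served at http://sub1.example.com/robots.txt
# (each subdomain answers robots.txt requests on its own host)
User-agent: *
Allow: /

Sitemap: http://sub1.example.com/sitemap.xml.gz
```

Because the Sitemap directive in robots.txt is discovered on crawl, sitemaps referenced this way do not each have to be submitted and verified individually in Webmaster Tools.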