Anyone else looking at the same issue, or have any feedback/comments?
Thanks
Any ideas? Is anyone working with photographers using Zenfolio, or with sites that have a large number of sitewide links pointing back?
A website of mine has about 4,000 backlinks, of which 2,500 come from one website to the homepage and about 6 internal pages. These have been built up over about 5 years, mainly via article posts.
The site was recently hit by Penguin 2.0 but has only had natural links built, so I'm wondering if the sitewide links are in fact the issue.
The website linking to mine is an authority source within its niche, but the concern is the number of backlinks coming from this one site and whether it may now be having a negative impact. When I've reviewed the links from this one site via a backlink removal tool, about 80% seem fine and the suggestion is to remove about 20% of them.
Would you keep all the sitewide backlinks or remove them?
Have you come across a similar situation, and how did it affect rankings/traffic?
We are using Zenfolio as a hosted photography/image gallery, set up as http://oursite.zenfolio.com
We have about 24,000 backlinks to the website; however, over 22,000 are from Zenfolio.
Do you see issues with this set-up from an organic SEO perspective, with so many links from one domain pointing back into the main site?
Thanks
As there are so many good visual analytics and visual comparison tools coming onto the market, this is exactly the sort of format people want for presenting data to clients so it's easy to understand.
So when you see the good visuals and data in the SEOmoz competitor report but have no way of extracting it, it's a bit frustrating.
For now I use alternative tools, but I hope SEOmoz implements a simple PDF export soon.
I am also looking to download the 'compare link metrics' data, but there doesn't appear to be a button for it?
Who would like to chip in with an answer to this please?
I think you are right in your observations; thanks for the response.
The URL is generated when someone searches for a location. It searches the items in that location, then shows the embedded map with the flags plus the matching items.
Yes, Google should index the URLs. A long URL is generated, as per the example above, when someone runs a location search and the location items appear.
Thanks
Jaz
I get the feeling I may just be having a conversation with myself??!!!
We are integrating Google Maps into a search feature on a website.
Would you use the standard dynamically generated long URL that appears after a search, or find a way of reducing it to a shorter URL?
Take into account that there can be hundreds of results.
The question is asked for SEO purposes.
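One way to avoid exposing the long dynamically generated search URL is to map each location search to a short, readable path instead. A minimal sketch, assuming a query-string URL like the ones described above (the domain, parameter names and path scheme here are placeholders, not the actual site's):

```python
import re
import urllib.parse

def location_slug(query: str) -> str:
    """Reduce a raw location search string to a short, readable URL slug."""
    slug = query.strip().lower()
    # Collapse any run of spaces/punctuation into a single hyphen
    slug = re.sub(r"[^a-z0-9]+", "-", slug)
    return slug.strip("-")

# A long dynamically generated search URL (hypothetical example)...
long_url = "http://example.com/search?location=San%20Francisco%2C%20CA&type=all&page=1"
query = urllib.parse.parse_qs(urllib.parse.urlparse(long_url).query)["location"][0]

# ...mapped to a short, crawlable path instead:
short_url = "http://example.com/locations/" + location_slug(query)
print(short_url)  # http://example.com/locations/san-francisco-ca
```

The server would then route requests for the short path back to the same search handler, so both users and crawlers only ever see the clean URL.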
Hi Sean,
Thanks for your response.
The URLs are different, so I also thought this might be an issue.
But from reading the Webmaster Guidelines, if the sites are targeting different international audiences, Google may recognise this and serve pages to the appropriate local audience rather than penalise.
I think the issue I have is that they are different URLs, and I'm concerned this may be seen as duplication.
So xyz.co.nz and abc.co.uk, but both using the same content on 1,000+ pages.
I'd appreciate input from anyone who reads this, particularly anyone who has faced this scenario.
Thanks
Zak
We have a large site with 1,000+ pages of content to launch in the UK.
Much of this content is already being used on a .nz URL, which is going to stay. Do you see this as an issue, or do you think Google will take localisation factors into consideration?
We could add a link from the NZ pages to the UK pages. Noindexing the pages is not an option for us.
Thanks
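For the duplicate-content concern above, one commonly suggested option is hreflang alternate annotations, which tell Google that the two URLs are country-targeted versions of the same page rather than duplicates. A sketch only, assuming the UK and NZ sites mirror each other page-for-page; the domains and path are placeholders based on the examples in this thread:

```html
<!-- In the <head> of http://abc.co.uk/widgets/ -->
<link rel="alternate" hreflang="en-gb" href="http://abc.co.uk/widgets/" />
<link rel="alternate" hreflang="en-nz" href="http://xyz.co.nz/widgets/" />

<!-- The same pair of annotations, mirrored, in the <head> of http://xyz.co.nz/widgets/ -->
<link rel="alternate" hreflang="en-gb" href="http://abc.co.uk/widgets/" />
<link rel="alternate" hreflang="en-nz" href="http://xyz.co.nz/widgets/" />
```

Each page in a pair must annotate both itself and its alternate, so with 1,000+ pages this is usually generated programmatically rather than hand-written.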
All I really wanted to know is whether the SEOmoz duplication tool carries out its full purpose.
I can see that SEOmoz identifies duplicate content via the crawl diagnostics, but it then seems not to show where these duplicate pages are. So it seems like the tool is only doing half of its job, unless I am using it incorrectly?
Does anyone know if this tool identifies which pages are the duplicates?
Thanks
From my reports in SEOmoz I can see pages showing as having duplicate content, but when I click on them it does not show me which pages carry the duplicate content.
Is there any way to check this via the SEOmoz reports?
If a website is moved to a new server, how can you ensure that all metadata moves with it?
We have a site with an XML feed going out to many other sites.
The XML feed is behind a password-protected page, so we cannot use a canonical link to point back to the original URL.
How do we stop the pages being crawled on all of the sites using the XML feed? With hundreds using it after launch, it will cause instant duplicate content issues.
Thanks
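If the goal is simply to keep the feed-driven pages out of the index, one hedged option (assuming each consuming site can edit its page templates) is a robots meta tag on those pages; if they can set HTTP headers instead, the equivalent X-Robots-Tag header works for non-HTML responses too:

```html
<!-- In the <head> of each page built from the feed: -->
<!-- noindex keeps the page out of the index; follow still lets crawlers follow its links -->
<meta name="robots" content="noindex, follow" />
```

This depends entirely on the consuming sites cooperating, since the directive has to live on their pages, not on the feed itself.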
The keyword should appear in the content, but it does not have to appear multiple times; the first and last paragraphs are sufficient.
Don't worry about meta keywords, but do use titles and descriptions, as these are what users see in the search engines. Don't duplicate these across pages.
Internal linking will improve the structure of your site, as will a sitemap.
Check out the SEOmoz beginners' guide, as everything should be in here: http://www.seomoz.org/beginners-guide-to-seo
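For reference, the title and description mentioned above are two tags in each page's <head>; the wording here is placeholder copy, and the point is that each page gets its own unique pair:

```html
<head>
  <!-- Shown as the clickable headline in search results; keep it unique per page -->
  <title>Red Widgets – Acme Widget Shop</title>
  <!-- Often shown as the snippet under the headline; also unique per page -->
  <meta name="description" content="Hand-made red widgets with free UK delivery. Browse sizes, colours and prices." />
</head>
```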
Hi Tina,
Do you think all XML feeds produce duplicate content on the sites they feed into?
The site the feed is coming from is not necessarily the most important one for this exercise. The hundred-plus other sites that will use the feed on their pages are all customers, so I'm working out the pros and cons of them displaying a large amount of data via the XML feed on their pages.
I could use the canonical tag; this would have to be set up to point back to the original site that has the feed.
The 100+ sites using the feed want to know if their pages can be indexed with the XML feed on them and be found in the search engines, or is this duplicate content without the canonical link?
For the purposes of this, the sites cannot create unique content.
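For what it's worth, a cross-domain canonical is a single line in the <head> of each consuming site's page, pointing at the original URL on the source site (the URL below is a placeholder, and note the earlier caveat in this thread that it only helps if the target URL is publicly crawlable, not password-protected):

```html
<!-- On each of the 100+ consuming sites' feed-driven pages -->
<link rel="canonical" href="http://originalsite.example.com/listings/item-123" />
```

It is a hint rather than a directive, so search engines may still choose a different canonical if the pages diverge.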
If a site has an XML feed being used by 100 companies to create the content on their sites, will those 100 sites receive any link juice?
Is there any way the content may be classed as duplicate across these sites? And should the page on the site the XML feed comes from be indexed first?