Help with Roger finding phantom links
-
It's Monday, Roger has done another crawl, and now I have a couple of issues:
- I have two pages showing a 404->302 or a 500 because these links do not exist. I have to fix the 500, but the 404 is trapped correctly:
http://www.oznappies.com/nappies.faq & http://www.oznappies.com/store/value-packs/
The issue is that when I do a site scan, there is no anchor text that contains these links. So what I would like to find out is where Roger is finding them. I cannot see anywhere in the Crawl Report that tells me the origin of these links.
- I also created a blog on Tumblr, and now every tag and RSS feed entry is producing a duplicate content error in the crawl stats. I cannot see anywhere in Tumblr to fix this issue.
Any ideas?
-
Thanks again Ryan, you have been very helpful answering a lot of my questions.
-
Someone else asked the same question regarding tag pages yesterday. I would suggest asking a separate question in Q&A on that topic.
Tag pages and forum category pages are both often used as containers: they don't have any content except links to articles. I would ask for feedback on the best practice. I suspect a noindex, follow directive on those pages would be best, but I don't have the experience to feel comfortable offering that advice.
-
I have been looking at the data that Roger is reporting for the duplicate content, and in ALL cases there is either a 301 or a noindex. So now I do not know why Roger is reporting them as duplicates; robots should not see the second entry.
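To double-check what Roger is actually seeing, a rough Python sketch along these lines (using the requests library; the URL list is just a placeholder for the pages flagged in the report) prints each page's status code plus whether it carries a noindex in the HTML or in the response headers:

```python
import re
import requests

# Placeholder list -- swap in the URLs flagged as duplicates in the crawl report
urls = [
    "http://www.oznappies.com/example-page-a",
    "http://www.oznappies.com/example-page-b",
]

for url in urls:
    # Don't follow redirects, so a 301 shows up as itself
    resp = requests.get(url, allow_redirects=False, timeout=10)
    # Rough regex check for <meta name="robots" ... noindex ...> in the HTML
    meta_noindex = bool(re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]*noindex', resp.text, re.I))
    # The directive can also be sent as an HTTP header
    header_noindex = "noindex" in resp.headers.get("X-Robots-Tag", "").lower()
    print(f"{url}: status {resp.status_code}, "
          f"meta noindex={meta_noindex}, header noindex={header_noindex}")
```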
-
I did not think of looking at the CSV report. I see it now, thanks Ryan. There should be a soft 404 handler in place to process the bad URLs, so I will have to see why it is not working.
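For reference, the general shape of that kind of handler is to catch the not-found condition, keep the 404 status for crawlers, and serve a friendly page to visitors. A minimal sketch, shown in Flask purely for illustration (the thread does not say what the site actually runs on):

```python
from flask import Flask

app = Flask(__name__)

@app.errorhandler(404)
def friendly_not_found(error):
    # Keep the 404 status code for crawlers, but give visitors a way to
    # stay on the site instead of hitting a dead end.
    body = (
        "<h1>Sorry, we couldn't find that page.</h1>"
        '<p>Try the <a href="/">home page</a> or the site search.</p>'
    )
    return body, 404

if __name__ == "__main__":
    app.run()
```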
With Tumblr, I was looking for an easy way to add a blog to the site.
The RSS is coming from Tumblr, as is all the content.
When we specify tags in Tumblr it creates URLs, e.g. mypage.com/article/tag1, mypage.com/article/tag2, mypage.com/article/tag3, which all contain the content of mypage.com/article without a canonical to the original. It is a really strange, non-SEO-friendly approach, so I wondered if anyone had similar problems.
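To confirm that, a quick check along these lines (a hedged Python sketch; the tag URLs are just the placeholder pattern from above) fetches each tag page and reports whatever rel=canonical it declares, if any:

```python
import re
import requests

# Placeholder tag URLs following the pattern described above
tag_urls = [
    "http://mypage.com/article/tag1",
    "http://mypage.com/article/tag2",
    "http://mypage.com/article/tag3",
]

for url in tag_urls:
    html = requests.get(url, timeout=10).text
    # Rough check: grab the href from a <link rel="canonical"> tag, if present
    match = re.search(
        r'<link[^>]+rel=["\']canonical["\'][^>]*href=["\']([^"\']+)["\']',
        html, re.I)
    print(url, "->", match.group(1) if match else "no canonical found")
```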
-
The crawl report offers a "referrer" field. That field shows where Roger found the offending link. In my experience it has always been accurate.
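If you would rather not hunt through the spreadsheet by hand, a quick filter like this works too (a sketch only: the file name and column headers are assumptions, so match them to your actual crawl export):

```python
import csv

# The file name and column headers here are assumptions -- adjust them to
# match the actual headers in your crawl CSV export.
phantom_urls = {
    "http://www.oznappies.com/nappies.faq",
    "http://www.oznappies.com/store/value-packs/",
}

with open("crawl_report.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        if row.get("URL") in phantom_urls:
            print(row["URL"], "was found on", row.get("Referrer"))
```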
When I try to access www.oznappies.com/faq I receive a 302 redirect and then a 500 error. I would recommend pointing non-existent pages to a soft 404 page: still provide a 404 response to browsers, but offer users a friendly way to find information (i.e. links/search) and stay on your site.
A great example of a soft 404 page is http://www.orangecoat.com/a-404-page.html
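If you want to see the chain Roger is hitting for yourself, here is a minimal sketch (Python with the requests library) that walks the redirects hop by hop for the /faq URL mentioned above:

```python
from urllib.parse import urljoin
import requests

url = "http://www.oznappies.com/faq"

# Walk the redirect chain manually so every intermediate status is visible
for _ in range(10):  # safety cap on the number of hops
    resp = requests.get(url, allow_redirects=False, timeout=10)
    print(resp.status_code, url)
    location = resp.headers.get("Location")
    if not location:
        break
    url = urljoin(url, location)  # handle relative Location headers
```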
For the Tumblr issue, I am not clear on the problem. Are you writing content and publishing it on both the oznappies.com site and your Tumblr site? Is this content then being published again on your site via an RSS import?
-
I removed the links and just left the text, so these will cut and paste now. It confuses me where Roger found the links.
Thanks for running the Xenu scan. I have tried other site scanners and come up blank.
-
That second link is anchored to the wrong place.
Regardless, I also cannot find the .faq page. I just ran Xenu over the site to see what it could find, but no broken links showed up.
Afraid I don't use Tumblr either, so eh, pretty useless post. Sorry.