How to find internal pages linking to a URL?
-
Hey, I had an issue where a client found a bad link on their site, but when I went to fix it I couldn't figure out where on earth it was. I tried different software that would find the link but not tell me where it was linked from. I asked someone in my office for help and they found it in about 15 seconds. Their strategy was "think like a client - just click everywhere".
Is there a way to quickly find what URLs are pointing to a specific URL?
Cheers
-
If you download the CSV report, it should list the referring URL.
-
Is there really no way to find this within Moz's tools? If I'm looking at a 404 or a duplicate page, I should be able to track back to the page(s) that the link is coming from.
--Jeff
-
Ah yes - it is easy to miss where that tab is - glad you found it!
-
Sweet, cheers! Definitely worth a read - I didn't really know the power of these two beyond checking for 404s.
-
And here's a post that talks about Screaming Frog and Xenu and some of the wonderful things they can help you with, for future reference. http://moz.com/blog/crawler-faceoff-xenu-vs-screaming-frog
-
Cheers CleverPhD, I used Screaming Frog but couldn't see where it shows this. On checking again I see there is a tab at the very bottom called "In Links" which shows all the links pointing to the selected URL. Awesome!
-
Use a tool like the Screaming Frog SEO Spider. It will show you the link and all the page(s) it appears on. If you find nothing, you have a spreadsheet showing that the link is not present. Remind the client that next time, if they can possibly grab a screenshot, it would really help.
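If you ever need to do the same thing without a desktop crawler, a small script can replicate the "In Links" view: crawl the site and record every page that contains a link to the target URL. Here's a minimal sketch using requests and BeautifulSoup - the start and target URLs are placeholders, and a real run would want politeness delays and better error handling.

```python
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse
from collections import deque

START_URL = "https://www.example.com/"           # placeholder: site root to crawl
TARGET_URL = "https://www.example.com/bad-link"  # placeholder: the URL you are hunting for

def find_inlinks(start_url, target_url, max_pages=500):
    """Crawl internal pages from start_url and return the pages that link to target_url."""
    domain = urlparse(start_url).netloc
    seen = {start_url}
    queue = deque([start_url])
    referrers = set()
    while queue and len(seen) <= max_pages:
        page = queue.popleft()
        try:
            resp = requests.get(page, timeout=10)
        except requests.RequestException:
            continue
        if "text/html" not in resp.headers.get("Content-Type", ""):
            continue
        soup = BeautifulSoup(resp.text, "html.parser")
        for a in soup.find_all("a", href=True):
            link = urljoin(page, a["href"]).split("#")[0]
            if link == target_url:
                referrers.add(page)  # this page links to the target URL
            if urlparse(link).netloc == domain and link not in seen:
                seen.add(link)
                queue.append(link)
    return sorted(referrers)

if __name__ == "__main__":
    for page in find_inlinks(START_URL, TARGET_URL):
        print(page)
```

It is the same idea as Screaming Frog's "In Links" tab, just scriptable when you only care about one URL.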
Related Questions
-
Will Google Judge Duplicate Content on Responsive Pages to be Keyword Spamming?
Web Design | CurtisB
I have a website for my small business, and hope to improve the search results position for 5 landing pages. I recently modified my website to make it responsive (mobile friendly). I was not able to use Bootstrap; the layout of the pages is a bit unusual and doesn't lend itself to the options Bootstrap provides. Each landing page has 3 main divs - one for desktop, one for tablet, one for phone.
The text content displayed in each div is the same. Only one of the 3 divs is visible; the user's screen width determines which div is visible. When I wrote the HTML for the page, I didn't want each div to have identical text. I worried that when Google indexed the page it would see the same text 3 times, and would conclude that keyword spamming was occurring. So I put the text in just one div, and when the page loads jQuery copies the text from the first div to the other two divs.
But now I've learned that when Google indexes a page it looks at both the page that is served AND the page that is rendered. And in my case the page that is rendered - after it loads and the jQuery code is executed - contains duplicate text content in three divs. So perhaps my approach - having the served page contain just one div with text content - fails to help, because Google examines the rendered page, which has duplicate text content in three divs.
Here is the layout of one landing page, as served by the server: a div id="desktop" where the 1000 words of text go, plus a tablet div and a phone div with no text (jQuery copies the text from div id="desktop" into each of them).
My question is: Will Google conclude that keyword spamming is occurring because of the duplicate content the rendered page contains, or will it realize that only one of the divs is visible at a time, and the duplicate content is there only to achieve a responsive design? Thank you!
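For reference, the served markup described above might look roughly like this - the tablet/phone ids are assumptions, since only the desktop div's id is given in the question:

```html
<!-- Served HTML: the landing-page text exists only once, in #desktop.
     The tablet/phone ids are assumed for this sketch. -->
<div id="desktop" class="copy">
  <!-- ~1000 words of landing-page text here -->
</div>
<div id="tablet" class="copy"></div>
<div id="phone" class="copy"></div>

<script>
  // After load, jQuery copies the text into the other two divs, so the
  // rendered DOM ends up containing the same text three times.
  jQuery(function ($) {
    $('#tablet, #phone').html($('#desktop').html());
  });
</script>
```

A single div whose layout is adjusted purely with CSS media queries would sidestep the question entirely, since the text would then exist only once in both the served and the rendered page.
-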
Shortened URL is breaking when URL is in Upper Case
Web Design | BCaudill
Hi there, currently I'm having some trouble mitigating an odd occurrence with some redirected shortened URLs being in upper case. Here is how they should be behaving:
www.rhinosec.com/webapp -> https://rhinosecuritylabs.com/landing/sample-report-webapp-pentest/
www.rhinosec.com/network -> https://rhinosecuritylabs.com/landing/sample-report-network-pentest/
www.rhinosec.com/se -> https://rhinosecuritylabs.com/landing/social-engineering-example-report/
But when the /______ is capitalized - for example WEBAPP, NETWORK, SE - WordPress either gives me a 404 or guesses the page and lands on:
NETWORK = https://rhinosecuritylabs.com/assessment-services/network-penetration-testing/
SE = https://rhinosecuritylabs.com/assessment-services/secure-code-review/
WEBAPP = 404
I was wondering if this discrepancy should be taken care of in the .htaccess file, Cloudflare, or the WordPress redirect plug-in?
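If the redirects end up in .htaccess, one option is to match the short paths case-insensitively with mod_rewrite's [NC] flag. A rough sketch, assuming the rules live on the www.rhinosec.com host and mod_rewrite is enabled (a WordPress redirect plugin or a Cloudflare rule could do the same, as long as whichever layer answers first matches case-insensitively):

```apache
# Sketch for .htaccess on www.rhinosec.com: match each short path
# regardless of case ([NC]) and 301 it to the long landing page.
RewriteEngine On
RewriteRule ^webapp/?$  https://rhinosecuritylabs.com/landing/sample-report-webapp-pentest/ [NC,R=301,L]
RewriteRule ^network/?$ https://rhinosecuritylabs.com/landing/sample-report-network-pentest/ [NC,R=301,L]
RewriteRule ^se/?$      https://rhinosecuritylabs.com/landing/social-engineering-example-report/ [NC,R=301,L]
```
-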
URL Re-Mapping Question - Do I need the theme of my business in my URL structure even though GWT knows what my site is about?
Web Design | PeteC12
Hi All, I am currently planning to do some URL remapping on my hire website, as a lot of my most important pages are far too many levels deep from the root domain. This is also making my sitemap untidy. In GWT, Google knows that the theme of my website is hire, as it's the top word. Therefore, do I still need to use the word hire in all my new URL categories/structures or not?
Examples:
http://goo.gl/BFmvk2 - I was thinking of remapping to www.xxxxxxx.xco.uk/tool-hire-birmingham
http://goo.gl/pC9Bdp - I was thinking of remapping to www.xxxxxx.co.uk/cleaning-equipment
Notice in the latter example, I do not have the word rent in the URL. Any advice is much appreciated, thanks. Peter
-
Funnel tracking with one page check-out?
Web Design | Jerune
Hi Guys, I'm creating a new website with a one-page checkout that follows these steps:
1. Check availability
2. Select product
3. Select additional product & add features
4. Provide personal information
5. Order & pay
I'm researching whether it is possible to track all these steps (and even steps within the steps) with Google Analytics in order to analyse checkout abandonment. The problem is that my one-page checkout has only one URL (I want to keep it that way), so the steps can't be differentiated by URL in the Analytics funnel. To continue to the next step, the same button (in a floating cart) is used to advance. The buttons to select/choose something within one step are all different.
Do you guys know how I can set this up and how detailed I can make this? For example, is it also possible to test at which field visitors leave when, for example, filling in their personal information? Would be great if you can help me out!
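One common approach is to fire a virtual pageview from the advance button, so each step gets its own fake URL that a Goal funnel in Google Analytics can use even though the real URL never changes. A sketch assuming the classic analytics.js ga() tracker; the ".checkout-advance" selector and data-step attribute are hypothetical hooks for this example:

```javascript
// Sketch (analytics.js): send a virtual pageview each time the visitor
// advances a step, e.g. /checkout/step-2-select-product.
$('.checkout-advance').on('click', function () {
  var step = $(this).data('step'); // e.g. "2-select-product"
  ga('send', 'pageview', '/checkout/step-' + step);
});

// Finer-grained interactions within a step (e.g. leaving a form field)
// can be sent as events instead:
$('#personal-info input').on('blur', function () {
  ga('send', 'event', 'Checkout', 'field completed', this.name);
});
```

With the virtual pageviews in place, the funnel is defined in the Goal settings against the /checkout/step-* URLs, and the field-level events show where people bail out inside a step.
-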
Multiple Local Schemas Per Page
Web Design | JoshAM
I am working on a mid-size restaurant group's site. The new site (in development) has a drop-down of each of the locations. When you hover over a location in the drop-down it shows the business's info (NAP). Each of the locations in the nav list is using schema.org markup. I think this would be confusing for search robots: every page has 15 address schemas, and the individual restaurant page's own NAP sits below all of the locations' schema/NAP in the DOM. Have any of you dealt with multiple schemas per page or a similar structure?
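For reference, a single location's NAP expressed as schema.org structured data looks roughly like the JSON-LD block below (names, URLs, and address are placeholders). Emitting one block like this for the location the page is actually about, and leaving the nav as plain HTML, keeps one unambiguous NAP per page instead of 15 competing ones:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Restaurant",
  "name": "Example Bistro - Downtown",
  "url": "https://www.example.com/locations/downtown/",
  "telephone": "+1-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701"
  }
}
</script>
```
-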
Joomla (page title override not working properly) - any techy guys out there?
Web Design | BizDetox
Hey Mozzers, I am having some problems with Joomla. I have tried many support forums, and since everyone here is in the same field as me, I thought this would be a great place to ask this question.
I am working with Joomla 2.5. After turning on the search engine friendly (SEF) configuration, you can override the alias of the page by providing a browser page title in the page display options. So I turned on SEF in the global configuration, turned on mod_rewrite, and made sure my .htaccess file was not .txt. But I am having some problems with this.
On some pages the page display option for the browser page title works, and on some it doesn't. On the pages where it doesn't, it is pulling the information from the alias (which is common with most sites). Why is it doing this? You can check out the pages yourself. Here is a page where it is not working:
http://tungstengem.com/mens-wedding-bands
and here is a page where it is working:
http://tungstengem.com/mens-wedding-...-bands-for-men
Also, for my homepage, when I didn't have the Apache rewrite on it showed the index.php and I was able to add an alias to it. Now the alias for the home page is not working.
-
Still too many internal links reported on page
Web Design | TheUniqueSEO
Hi Guys, I am new here, very much learning a lot, and enjoying the benefits of being an SEOMoz user. So here goes with my first question (probably of many).
I have known for some time that our website has a top-heavy number of links in the primary navigation, but I wasn't too sure how important this was. Our main objective was to make an easy to use nav for customers, and all of the feedback we have had says that customers really like our navigation, as it is easy to use. However, when running an SEOMoz campaign on our site, we again got back that there are too many links on the pages - for example, the home page has 500+ links. So I decided to do something about this.
I have implemented what I think is a good solution whereby the drop-down navigation isn't loaded on first load. If the user then hovers over one of our "departments", the sub-navigation is loaded via Ajax and dropped in. This means if the user wants it, they get it; if not, then it's not loaded with the page. My theory being that Google loads the page without all the links, but a user gets the links as and when they need them.
I tested with the SEOMoz toolbar and this tells me that when I load the home page there are 167 links in it vs 500+ previously. However, my campaign still tells me that my home page has 450+ links (and this is a recent crawl of the page). Our site is here: www.uniquemagazines.co.uk
Can you tell me: is what I have done a) a good solution, and b) does the SEOMoz crawler have the ability to trigger the hover event and cause the AJAX load of the sub-navigation content?
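For reference, the lazy-loaded navigation described above might look roughly like this - the ".nav-department" selector, data-department attribute, and "/ajax/sub-nav" endpoint are hypothetical:

```javascript
// Sketch: fetch a department's sub-navigation the first time it is hovered,
// instead of shipping all 500+ links in the initial HTML.
$('.nav-department').one('mouseenter', function () {
  var $item = $(this);
  $.get('/ajax/sub-nav', { department: $item.data('department') }, function (html) {
    $item.find('.sub-nav-placeholder').html(html);
  });
});
```

Whether a crawler counts those links comes down to whether it fires the mouseenter handler and executes the AJAX call at all, which is exactly the question above.
-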
Dynamic pages and code within content
Web Design | tgraham
Hi all, I'm considering creating a dynamic table on my site that highlights rows/columns and cells depending on buttons that users can click. Each cell in the table links to a separate page that is created dynamically, pulling information from a database.
Now I'm aware of the Google guidelines: "If you decide to use dynamic pages (i.e., the URL contains a "?" character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few."
So we wondered whether we could put the dynamic pages in our sitemap so that Google could index them - the pages can be seen with JavaScript off, which is how the pages are manipulated to make them dynamic. Could anyone give us an overview of the dangers here?
I also wondered if you still need to separate content from code on a page? My developer still seems very keen to use inline CSS and JavaScript! Thanks a bundle.
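For reference, listing the dynamically generated pages in an XML sitemap looks roughly like this (the URL pattern is a placeholder); note that a literal & in a query string has to be escaped as &amp; inside the XML:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- placeholder URL pattern; the & in the query string is escaped for XML -->
    <loc>https://www.example.com/table/cell?row=3&amp;col=7</loc>
  </url>
  <!-- ...one <url> entry per dynamically generated page... -->
</urlset>
```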