External URLs - ones you can't reach out to
-
My fellow Mozzers,
I have been reviewing our Google Webmaster error reports and noticed a high number of URL errors. These URLs come from international sites, mainly in China. Upon further inspection, they look to be links to dynamic URLs (search pages) that are no longer active on our site.
These Chinese sites are linking to old URLs that now simply return a 'Bad Request' page. The problems I face are:
- I can't contact these Chinese sites to remove or edit the URLs.
- I could work with my developers to identify the URLs and redirect them all to the homepage, but is that a good approach? The URLs would still be present.
- Some of these look like pages that haven't been updated in a while, so now I have links from sites that are archived, or "dead".
Have you tackled anything like this before? Thoughts are welcome.
Thanks
-
Agreed. Great answer, Highland.
-
I'm actually looking into creating a dynamic block on a static 404 page. The block would pull in products and data based on the URL string, so the page recognizes the URL and displays relevant content.
The thing with the Chinese URLs is that they include an unusual character, which is why I get a 'Bad Request' page rather than a 404; the server doesn't recognize the character. I'm in talks with the developer to see if we can direct these to a 404 instead.
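Roughly what I'm picturing, as a minimal sketch only: Flask is used purely for illustration (our stack may differ), and the catalogue data and the `find_related_products` helper are hypothetical stand-ins for our own product lookup.

```python
# Minimal sketch: a 404 handler that shows content related to the dead URL,
# plus a 400 handler so URLs with stray characters get the same treatment.
# Flask is used only for illustration; the catalogue below is a stand-in.
from flask import Flask, request

app = Flask(__name__)

CATALOGUE = {
    "widgets": "/products/widgets",
    "gadgets": "/products/gadgets",
}

def find_related_products(path):
    """Hypothetical helper: match words in the dead URL against the catalogue."""
    words = path.lower().replace("-", "/").split("/")
    return [url for name, url in CATALOGUE.items() if name in words]

def not_found_page(status):
    links = find_related_products(request.path)
    body = "<h1>Page not found</h1>"
    if links:
        body += "<p>You might be looking for:</p>" + "".join(
            f'<a href="{u}">{u}</a><br>' for u in links
        )
    # Keep the 404 status so search engines drop the dead URL,
    # while human visitors still get somewhere useful to go.
    return body, status

@app.errorhandler(404)
def handle_404(_error):
    return not_found_page(404)

@app.errorhandler(400)
def handle_400(_error):
    # Requests rejected for odd characters land here (when the framework,
    # rather than the front-end server, raises the 400); answer with a 404 instead.
    return not_found_page(404)
```

If the 'Bad Request' is being generated by the web server itself rather than the application, the mapping from 400 to the 404 page would have to happen in the server config instead, which is what I need to confirm with the developer.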
Thanks
-
I would just 404 the pages (provided they're not already) and move on. There's nothing wrong with having dead links pointing to your site. Some people would recreate the pages to try to capture some of that link juice, but if the links are from abandoned Chinese sites I don't think they would provide the quality you need.
If you want the best of both worlds, have the page return a 404 but include some content that directs anyone who happens to click through to another part of your site.
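As a quick sanity check that the reported URLs really are returning 404 (and not 200 or 400), something along these lines works. It's a rough sketch assuming you've exported the URLs from Search Console into a plain text file (`crawl_errors.txt` is a hypothetical name) and have the `requests` library installed:

```python
# Rough sketch: confirm what status code each reported URL actually returns.
# Assumes one URL per line in crawl_errors.txt (hypothetical export file).
import requests

with open("crawl_errors.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    try:
        status = requests.head(url, allow_redirects=False, timeout=10).status_code
    except requests.RequestException as exc:
        status = f"error: {exc}"
    print(status, url)
```

Anything coming back 200 (or 400, as in your case) is worth fixing so it returns a proper 404.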