External URLs - ones you can't reach out to
-
My fellow Mozzers,
I have been reviewing our Google Webmaster error reports and noticed a high number of URL errors. These URLs originate from international sites, mainly in China. Upon further inspection, they look to be links to dynamic URLs (search pages) that are no longer active on our site.
These Chinese sites are linking to old URLs that now simply return a 'Bad Request' page. The problems I face are:
- I can't contact these Chinese sites to remove or edit the URLs.
- I could work with my developers to identify the URLs and redirect them all to the homepage, but is that a good idea? The inbound links would still exist.
- Some of these look like pages that haven't been updated in a while, so I now have links from sites that are archived, or "dead".
Have you tackled anything like this before? Thoughts are welcome
Thanks
-
Agreed. Great answer Highland.
-
I'm actually looking into creating a dynamic block on a static 404 page. The block would feed in products/data based on the URL string, so it recognizes the URL and displays relevant content.
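A minimal sketch of what that dynamic block could look like, framework aside - the catalog and matching logic here are placeholders for illustration, not anyone's actual implementation:

```python
import re

# Placeholder catalog; on a real site this would query the product database.
PRODUCTS = ["red widget", "blue widget", "widget polish"]

def suggest_for_dead_url(path):
    """Given the path of a dead URL, return a (status, body) pair for a
    404 page that still surfaces relevant products."""
    # Pull word-like tokens out of the requested path, e.g. /shop/red-widget
    terms = re.findall(r"[a-z]+", path.lower())
    suggestions = [p for p in PRODUCTS if any(t in p.split() for t in terms)]
    if suggestions:
        body = "Page not found. You might be looking for: " + ", ".join(suggestions)
    else:
        body = "Page not found."
    return 404, body  # keep the 404 status so engines eventually drop the URL

status, body = suggest_for_dead_url("/shop/red-widget?ref=123")
print(status, body)
```

The key point is returning the suggestions alongside a genuine 404 status, so visitors get somewhere to go while search engines still see the page as gone.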
The thing with the Chinese URLs is that they introduced a unique character into the URL, which is why I get a 'Bad Request' page rather than a 404 - the server doesn't recognize the character. I'm in talks with the developer to see if we can redirect them to a 404.
Thanks
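For the 'Bad Request' problem above, one option is to classify malformed paths at the application layer and answer 404 instead of letting the server emit a 400. This is only a sketch: the allowed-character set and routing table here are assumptions, and real server behavior varies:

```python
import string

# Assumption: the server accepts only these characters in a path; anything
# else currently triggers its generic 'Bad Request' (400) page.
ALLOWED = set(string.ascii_letters + string.digits + "/-_.~%")

# Stand-in for the site's real routing table.
KNOWN_PATHS = {"/", "/shop"}

def status_for(path):
    """Return the status a request should get: malformed paths are treated
    as 'not found' (404) instead of 'bad request' (400)."""
    if any(ch not in ALLOWED for ch in path):
        return 404  # e.g. a path carrying an unexpected Unicode character
    return 200 if path in KNOWN_PATHS else 404
```

Answering 404 consistently matters because search engines treat a 404 as "drop this URL", whereas a 400 is ambiguous and may keep the error lingering in reports.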
-
I would just 404 the pages (provided they're not already) and move on. There's nothing wrong with having dead links pointing to your site. Some people would recreate the pages to try to capture some of that link juice, but if the links are from abandoned Chinese sites, I don't think they would provide the quality you need.
If you want the best of both worlds, have the page return a 404 status but serve some content that directs anyone who happens to click through to another part of your site.
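To confirm which of the reported URLs already return a 404 (rather than 200 or 'Bad Request'), a quick stdlib script can tally their status codes. The URL list here is illustrative; in practice you would feed in the export from the Webmaster error report:

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError
from collections import Counter

def status_of(url, timeout=10):
    """Return the HTTP status a URL answers with (0 on network failure)."""
    try:
        with urlopen(Request(url, method="HEAD"), timeout=timeout) as resp:
            return resp.status
    except HTTPError as e:
        return e.code  # 4xx/5xx responses still tell us what we need
    except URLError:
        return 0

def summarize(statuses):
    """Tally statuses so you can see how many dead URLs 404 cleanly."""
    return dict(Counter(statuses))

if __name__ == "__main__":
    urls = ["https://www.example.com/old-search?q=1"]  # from the GWT export
    print(summarize(status_of(u) for u in urls))
```

A summary like `{404: 120, 400: 30}` immediately shows how many URLs still need the bad-request fix discussed above.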
Related Questions
-
Houston Company Needs Help (Will Our SEO Work Be Destroyed While Site is Down?, Can Anything be Done?)
I'm a Moz member, mostly just lurk, and love the Moz community. I work at a non-profit company that does benchmarking data and helps school districts improve processes in education. Our building flooded and our site is currently offline - is there anything we can do to stop/lessen any SEO ranking drop between now and when we are back up? We have worked very hard to get these rankings. I know it is minor compared to all the tragedy in Houston, but we have worked hard for these SEO gains (YEAH MOZ!) and I'd hate to lose them because of Harvey. Any suggestions/assistance appreciated! Ralph
SERP Trends | | inhouseninja1
-
URL Parameter for Limiting Results
We have a category page that lists products. We have parameters, and the default value limits the page to displaying 9 products. If the user wishes, they can view 15 or 30 products on the same page. The parameter is ?limit=9 or ?limit=15 and so on. Google is recognizing this as duplicate meta tags and meta descriptions via HTML Suggestions. I have a couple of questions.
1. What should be my goal? Is my goal to have Google crawl the page with 9 items or crawl the page with all items in the category? In Search Console, the first part of setting up a URL parameter asks "Does this parameter change page content seen by the user?". In my opinion, the answer is Yes. Then, when I select how the parameter affects page content, I assume I'd choose Narrows, because it's either narrowing or expanding the number of items displayed on the page.
2. When setting up my URL parameters in Search Console, do I want to select Every URL or just let Googlebot decide? I'm torn, because what I read about Every URL says this setting could result in Googlebot unnecessarily crawling duplicate content on your site (it's already doing that). When reading further, I begin to second-guess the Narrows option. Now I'm at a loss on what to do. Any advice or suggestions will be helpful! Thanks.
SERP Trends | | dkeipper0
-
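One common remedy for duplicate-content warnings like those described above, independent of the Search Console parameter settings, is to point every ?limit variant at a single canonical URL. A sketch of building that canonical target, with an illustrative domain:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def canonical_url(url, drop=("limit",)):
    """Build the rel=canonical target by stripping display-only query
    parameters, so ?limit=9 / ?limit=15 / ?limit=30 all point one place."""
    parts = urlsplit(url)
    # Keep every query parameter except the ones that only change display
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in drop]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(canonical_url("https://shop.example.com/widgets?limit=15"))
# → https://shop.example.com/widgets
```

The resulting URL would go in a `<link rel="canonical" href="...">` tag in the head of every variant, signaling which version should be indexed.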
Google News results ...can it be SEOed?
Hello Everyone. I simply wanted to know if anyone has useful insight on what it takes for a legitimate website to appear within the Google News results. I have rarely, if ever, had to dabble in this kind of SEO, but after coming across a situation with a perfectly legitimate website, I'm now scratching my head. The site in question is a very well established website with zero "SEO" done to it. All links organic, all traffic legit, and they have a VERY strong social media presence. The site's current DA is 50. It's a 3-letter domain. Some of the points I believe are important:
- quantity and quality of content (% of aggregated vs actually original content)
- overall % of "news" content vs the rest of the site's content
- authors'/writers' credentials (how would Google evaluate the authority of a writer, so his/her content is newsworthy?)
- overall site authority
- rich snippets and code needed to be indexed? I think rel=publisher or rel=author tags have something to do with it?
- making sure basic SEO is in place: canonical tag, unique headers, etc.
What am I missing? They have one particular competitor that seems to rank for almost everything news related, while being a similar site in content and authority; however, they are nowhere. They have submitted to Google News before (I'm not even sure what that means) but have failed to be included - does this put a "stain" on them for any reason or impede the possibility of being indexed in the Google News results in the future? ANY input is appreciated.
SERP Trends | | 1stOTLJK0
-
Why does Google show sites in the SERP with a Wikipedia link as part of their brand name?
Hi Mozzers & Moz Team, does anyone know how a site comes to appear in Google SERPs with a Wikipedia link associated with its brand name? It is great exposure, but I really want to know how this brand-associated Wikipedia link comes about. Some results also come with a DMOZ link if the brand isn't found in Wikipedia. Please see the attached snapshot, where I bordered the link in red to show it clearly. Awaiting your responses.
SERP Trends | | Futura
-
Long URL Warning
Dear experts, I have 1490 warnings for long URLs. These URLs are generated automatically by Prestashop from the product titles, and they are very readable. Can you please tell me what the impact of these long URLs is on my SEO, and how I can shorten them if they are automatically generated? Regards,
SERP Trends | | kanary0
-
Can specific keywords get penalized? - our whole situation
For the last 3 years we created backlinks with 3 main anchors for our website. As an example, the domain name is www.jackusedcars.com, and the keywords are: bmw, audi, mercedes. We chose some big keywords as our main keywords, plus some smaller search-volume terms: buy used cars, used cars sell off. 90% of backlinks use the main keywords. OSE:
BMW 2,505 162,638
audi 1,111 209,542
mercedes 735 64,649
used cars 382 28,368
car sale 136 8,517
toyota 108 13,106
buy used car 34 820
car sell off 28 710
usedcars.com 26 45
sold cars 23 472
The website title, for example, is: BMW Shop, buy Audi and Mercedes used cars. 90% of backlinks point to the index page. (We now have 5,158 linking root domains; 512k total links.) All backlinks are related; we never used any auto-spam tool, etc.
In November 16th-30th, 2011, 'audi' keyword traffic dropped by around 80%. Other keywords were OK.
We weren't hit by Penguin on August 24 - the graph stayed the same. Since November, our index page traffic has dropped by 70%. 1st question: Did we get penalized for over-optimizing the 'AUDI' keyword, or is there another reason it stopped driving traffic? I know that for such link building we could get hit by Penguin on the next update. So now we are de-optimizing the website, changing our old backlinks to different anchors and different URLs (car pages, with car-name anchors). We now rank quite well with product pages - always on the 1st page; the exact SERP position depends on the competition. For example, "Used BMW 530 car for sale". We are creating new backlinks like this:
10% to the index page with different anchors - not the old big ones
20% to http://italian.jackusedcars.com with Italian anchors
20% to http://www.jackusedcars.com/search/bmw530 with "buy bmw530", "bmw530 sale" and so on
50% to product pages like http://www.jackusedcars.com/BMW-530-i-x-2009-full-options.html with the product name as anchor
We still want to get our traffic back for the popular keywords. They are still written in the title, and on-page keyword density is 0.99% (previously 1.37%).
Every month we lose around 10% of traffic to the index page. We were in the top 3 with these keywords, and now only 1 keyword is somewhere in the top 10; the others are not even in the top 50. 2nd question: Should we remove "BMW, Audi, Mercedes" from the title? (They still drive around 20% of traffic, plus 20% from bmw sub-keywords.) We could lose almost 50% of total traffic, and only sub-pages would drive traffic with non-popular keywords. We plan to make a page www.jackusedcars.com/bmw and optimize it for the "BMW" keyword. Could that work? Some old backlinks would be changed to point to this page.
Our best conversions come from these main keywords, so we really need to get them back. All comments are welcome. Graph attached.
SERP Trends | | bele
-
My customer has about 50 domain names they own, what can I suggest they do with them to increase SEO?
This customer has purchased about 50 domains over the years and has them all redirecting to their main website. What could I suggest they do with all of these domain names to increase SEO? Any ideas are greatly appreciated... Thanks!
SERP Trends | | jboddiford1
-
Why are search results different for 'Yahoo Search' and Powered by Yahoo search?
So if you misspell something (no DNS to connect with - an unregistered domain, for example) in a browser on a Road Runner connection, it takes you to a search engine 'powered by Yahoo search' which searches for whatever you typed. Anyway, today we noticed that the website of one of our clients isn't coming up in this 'Powered by Yahoo Search' via Road Runner, but comes up in Yahoo Search for the same query. In fact, we noticed that our client's website (a 10+ year established site) is missing in the Yahoo-powered search but present on all other engines. Any clues? I thought results should be generally the same across Yahoo / Yahoo Powered / Bing / Bing Powered. Could this be using something from before the Bing merger?
SERP Trends | | qlkasdjfw0