How to fix this issue?
-
I redesigned my website, moving it from Wix to plain HTML.
The URLs have now changed from
http://www.spinteedubai.com/#!how-it-works/c46c
to
http://www.spinteedubai.com/how-it-works.html
The same applies to all the other pages. How can I fix this? Both versions of each page are also indexed in Google.
-
Hi Alexander,
While there are some server-side technical things you can do to force a 404 error for a given URL, the best thing to do is remove the content in question from your server. At the very least, removing it should result in a 404 status code when someone visits the URL that once housed the content. Ideally, if you can also configure a custom, more user-friendly 404 page, that's even better.
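As a rough sketch of what "configuring a custom 404 page" can look like: on an Apache server (that is only an assumption about the hosting, and the file name 404.html below is just an example) it usually comes down to one directive in the site's .htaccess file, plus the page itself:

# .htaccess sketch, assuming an Apache server: show a friendlier custom
# page whenever a requested URL returns a 404 (Not Found) status code.
ErrorDocument 404 /404.html

The status code itself still comes from the missing resource; the directive only controls what visitors see when they hit it.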
Now, depending on how your server is configured, there may be instances when a URL should produce a 404 error, but doesn't. I only bring this scenario up as a possibility because it's something I am currently dealing with on one of the sites I manage.
In any case, you may need to work closely with your server administrator or Web developer to achieve what you need. Most likely, it's just a matter of removing the old content from the server. Hope that helps!
Dana
-
How can I add a 404 error? What are the steps?
-
Hi Alexander,
It looks like you've implemented the canonical tags properly. It can, however, take Google a very, very long time (sometimes years) to remove old content. If you really want the old page/URL out of Google's index, the best and quickest way to achieve that is to make sure the old page returns a proper 404 status code and then use GWT's Remove URL tool to request that Google remove it from the index. This still isn't immediate, but I've seen URLs removed in as little as a week using this method. Hope that helps!
Dana
-
Hi Alexander,
You can either 301-redirect the old page http://www.spinteedubai.com/#!how-it-works/c46c to the new page http://www.spinteedubai.com/how-it-works.html,
or you can set up a rel=canonical tag if it's the same content and you want to keep the old URL (brief sketches of both options are below).
You would then either have to wait or use the Remove URLs tool to get the old URL out of the index: https://www.google.com/webmasters/tools/removals
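To make those two options a bit more concrete, here are minimal sketches; they assume an Apache server and are illustrations only, not configuration taken from the site in question.

The rel=canonical tag goes in the <head> of the duplicate page and points at the URL you want Google to keep:

<!-- In the <head> of the duplicate page, pointing at the preferred URL -->
<link rel="canonical" href="http://www.spinteedubai.com/how-it-works.html" />

A path-based 301 in .htaccess could look like the line below. One caveat: everything after the # in the old Wix-style URL is a fragment that browsers never send to the server, so a server-side rule like this only helps where the old and new paths actually differ.

# .htaccess sketch, assuming Apache's mod_alias: permanently redirect
# a hypothetical old path to the new .html page (exact-match regex so
# the new URL itself is not caught and re-redirected).
RedirectMatch 301 ^/how-it-works$ http://www.spinteedubai.com/how-it-works.html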