My URLs changed with a new CMS and now search engines see pages as 302s. What do I do?
-
We recently changed our CMS from PHP to .NET. The old CMS did not allow for folder structure in URLs, so every URL was www.mydomain/name-of-page.
In the new CMS we either have to have .aspx at the end of the URL or a trailing slash. We opted for the slash, but now my PageRank is dead, and Google Webmaster Tools says my existing links are now going through an intermediary page. Everything resolves to the right place, but it looks like spiders see our new pages as being 302-redirected.
An example of what's happening:
Old page: www.mydomain/name-of-page
New page: www.mydomain/name-of-page/
What should I do? Should I go in and 301-redirect the old pages? Will this clear up by itself in time?
-
I assume you have used some kind of rewrite on the server to change URLs to the trailing slash rather than .aspx. That rewrite could be using a redirect to move traffic from .aspx to /.
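To confirm what the spiders are actually seeing, here is a self-contained Python sketch (all names hypothetical, not part of your setup) that reproduces a slash-adding 302 on a local test server and inspects the first raw response without following the redirect:

```python
# Hypothetical diagnostic: reproduce the slash-adding redirect locally
# and look at the first raw response, the way a crawler would.
import http.server
import threading
import urllib.error
import urllib.request

class SlashRedirectHandler(http.server.BaseHTTPRequestHandler):
    """Mimics the reported behaviour: redirect /page to /page/."""
    def do_GET(self):
        if not self.path.endswith("/"):
            self.send_response(302)  # temporary -- the suspected problem
            self.send_header("Location", self.path + "/")
            self.end_headers()
        else:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"page content")

    def log_message(self, *args):  # keep the output quiet
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), SlashRedirectHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Refuse to follow redirects so we can inspect the first response."""
    def redirect_request(self, *args, **kwargs):
        return None

opener = urllib.request.build_opener(NoRedirect)
try:
    opener.open(f"http://127.0.0.1:{port}/name-of-page")
    status, location = 200, None
except urllib.error.HTTPError as err:
    status, location = err.code, err.headers["Location"]

server.shutdown()
print(status, location)  # 302 /name-of-page/
```

If your live server prints 302 for the same kind of request, that is exactly what Webmaster Tools is reporting; the fix is making that redirect a 301.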
Firstly, I'd check how the URLs are rewritten (if they are). Unfortunately, my knowledge of Windows server configs is limited; on Linux I'd be checking my .htaccess file and its redirect/rewrite rules.
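On the Linux side, a minimal .htaccess sketch (assuming Apache with mod_rewrite enabled) of what a correct trailing-slash rule looks like. The `R=301` flag is the important part; an external rewrite without an explicit code is issued as a temporary 302, which matches what the spiders are reporting:

```apache
# Hypothetical rule: add the trailing slash with a permanent redirect.
RewriteEngine On
# Skip real files (images, CSS, etc.)
RewriteCond %{REQUEST_FILENAME} !-f
# Redirect /name-of-page to /name-of-page/ with a 301
RewriteRule ^(.*[^/])$ /$1/ [R=301,L]
```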
Once you have ascertained how the redirect is happening and resolved the issue, I'd implement the correct 301 redirects.
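Since your site is on .NET, the equivalent on Windows is a rule for the IIS URL Rewrite module. A hedged web.config sketch (assuming the URL Rewrite module is installed; the rule name is illustrative) that 301-redirects the old slash-less URLs to their new trailing-slash counterparts:

```xml
<!-- Hypothetical web.config fragment: permanently redirect
     /name-of-page to /name-of-page/ via IIS URL Rewrite. -->
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="AddTrailingSlash301" stopProcessing="true">
          <match url="(.*[^/])$" />
          <conditions>
            <add input="{REQUEST_FILENAME}" matchType="IsFile" negate="true" />
            <add input="{REQUEST_FILENAME}" matchType="IsDirectory" negate="true" />
          </conditions>
          <action type="Redirect" url="{R:1}/" redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```

`redirectType="Permanent"` is what turns the redirect into a 301; the conditions keep static files and real directories from being rewritten.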
I'd also like to ask: why did you opt for name-of-page/ rather than name-of-page.aspx?