Renamed a page and created a 301; the page lost its rankings.
-
We changed a page's URL so it falls under the root of our site, from domain.com/page1/page301d/ to domain.com/page301d/, and after two weeks it still hasn't returned to its #3 position. It's now at the bottom of page 3. I can't figure out what I'm doing wrong here.
The original domain.com/page1/ page that this page fell under was removed entirely and redirected to another page that was more relevant. I went ahead and re-enabled this page and its content, because it linked out to the page we 301'd. The re-enabled page had about 150 links pointing to it, so I was thinking that maybe the link juice (or relevancy) it passed via an internal link was helping the page rank. This was updated about six days ago and the internal link is back.
Any other ideas why this might not be working? I've checked all the 301s, and the content on the page hasn't changed.
We have updated the structure for many pages. Instead of having the pages in question fall under another page, they all now fall under the root, and their sub-content is only two levels deep instead of three.
Hope that makes sense.
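For anyone else double-checking their 301s, one useful habit is to walk each redirect and count the hops, since a URL that takes several jumps to resolve is a red flag. This is a minimal sketch in Python using an in-memory redirect map (the URLs are from this thread; in practice you'd build the map from your server config or by crawling with an HTTP client):

```python
def redirect_hops(url, redirect_map, max_hops=10):
    """Follow a chain of redirects (old URL -> new URL) and return
    the list of hops until we reach a URL that no longer redirects."""
    hops = []
    while url in redirect_map:
        url = redirect_map[url]
        hops.append(url)
        if len(hops) >= max_hops:
            raise RuntimeError("Possible redirect loop: " + " -> ".join(hops))
    return hops

# Hypothetical map modeling the move described above.
redirects = {
    "https://domain.com/page1/page301d/": "https://domain.com/page301d/",
}

print(redirect_hops("https://domain.com/page1/page301d/", redirects))
```

A single hop is what you want; anything longer means old redirects are chaining into newer ones.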
-
Thanks, fellas, for the replies. After a week and a half the rankings came back, and we actually moved up one spot. Essentially, I removed 301s that were very old and were sending Google through three hops to reach the new page (there were multiple old pages that redirected to one another in historical order before arriving at the newest one). I also 301'd all of the old pages directly to the new page. We also re-enabled a page that was the parent directory of the page in question (it linked to that page) and had about 150 links pointing to it (it had originally been 301'd to another page).
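In case it helps anyone else, the chain-flattening step can be sketched as a small script: take your map of old-URL-to-new-URL redirects and rewrite every entry to point straight at its final destination. The URLs here are made up for illustration:

```python
def flatten_redirects(redirect_map):
    """Rewrite every entry so old URLs 301 straight to the final
    destination instead of chaining through intermediate pages."""
    flat = {}
    for old in redirect_map:
        target = redirect_map[old]
        seen = {old}
        while target in redirect_map:
            if target in seen:
                raise ValueError("Redirect loop at " + target)
            seen.add(target)
            target = redirect_map[target]
        flat[old] = target
    return flat

# Hypothetical history: three generations of the page, each 301'd
# to the next, forcing Google through three jumps.
chain = {
    "/page-v1/": "/page-v2/",
    "/page-v2/": "/page-v3/",
    "/page-v3/": "/page301d/",
}

print(flatten_redirects(chain))
# every old URL now points directly at /page301d/
```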
Thanks
-
Just as Matt says, not all 301s are equal.
You should also check your sitemap after changing URLs — maybe the sitemap still holds the old URL.
You need to check your content for references to the old URL too, and change them to the new one.
You need to check your canonical tags for the old URL and replace it with the new one. For me this is a technical/on-page SEO issue, and something was missed in the 301 redirect setup.
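The sitemap check above is easy to script: parse the sitemap and flag any `<loc>` entries that still point at URLs you've 301'd away. A minimal sketch using Python's standard library (the sitemap and URLs are hypothetical):

```python
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def stale_sitemap_urls(sitemap_xml, redirected_urls):
    """Return sitemap <loc> entries that still point at old, 301'd URLs."""
    root = ET.fromstring(sitemap_xml)
    locs = [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]
    return [u for u in locs if u in redirected_urls]

sitemap = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://domain.com/page301d/</loc></url>
  <url><loc>https://domain.com/page1/page301d/</loc></url>
</urlset>"""

old = {"https://domain.com/page1/page301d/"}
print(stale_sitemap_urls(sitemap, old))
```

Any URL this flags should be swapped for its new address before resubmitting the sitemap.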
-
301 redirects do not pass 100% of the link "juice" you have earned for the original page. There's a longer discussion about it here: https://moz.com/community/q/how-much-juice-do-you-lose-in-a-301-redirect
Most SEOs I know say a 301 generally passes about 90% of the value. But what I've found is that it passes more as each link is rediscovered by Google. So if you had an eBay homepage link that's crawled every 10 minutes, you'd get the juice from that link back much more quickly than from a blog post from 2011 that Google may never visit again.
I would say just keep working on the site and move forward. Either you'll stay at around 90% of the previous juice, so keep working, or you'll regain most of it over time as those links are re-crawled, so... keep working.