Super URL Keyword Redirects - Google Safe?
-
Hi Guys
Any help would be much appreciated here.
Can anyone tell me, if I were to use a keyword super URL redirect, would it still be possible to rank high in Google?
Does Google have an issue with redirects?
Thanks Guys
Gareth
-
Great, thank you for your help, Andy.
-
I would have a read of this post, Gareth. It runs through some great pointers on Amazon optimisation. There is another good article here on SEJ.
I hope this helps.
-Andy
-
Hi Andy,
Sorry - yes, I mean the URL of my product page. I was thinking of doing some SEO on my product page to help achieve more sales etc.
-
Hi Gareth,
"...if I should use a re-direct or the normal URL that Amazon creates for you..."
Do you mean the URL to that product? And if so, where are you planning on linking from / to?
-Andy
-
Thanks Andy,
Well, I have some products I sell on Amazon and I am wondering if I should use a redirect or the normal URL that Amazon creates for you.
So my thinking was: what if I created a URL that had my keywords in it and did some SEO on that URL?
Thanks
Gareth
-
Hi Gareth,
Google doesn't have issues with redirects when used correctly, but can you explain what you mean by a keyword super URL redirect?
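As an aside, redirects tend to cause trouble mainly when they are chained or use the wrong status code, so it's worth confirming that any redirect you set up resolves in a single, clean 301 hop. Here is a rough Python sketch for checking that (the URL below is just a placeholder):

```python
# Rough sketch: follow a redirect and print each hop, so you can confirm
# it is a single 301 straight to the final page (placeholder URL below).
import requests

response = requests.get("https://example.com/old-keyword-url", allow_redirects=True)
for hop in response.history:
    print(hop.status_code, hop.url, "->", hop.headers.get("Location"))
print("Final:", response.status_code, response.url)
```

A single 301 followed by a 200 on the final URL is the pattern you want to see; long redirect chains are where problems usually creep in.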
-Andy
-
Related Questions
-
After hack and remediation, thousands of URLs still appearing as 'Valid' in Google Search Console. How to remedy?
I'm working on a site that was hacked in March 2019; in the process, nearly 900,000 spam links were generated and indexed. After remediation of the hack in April 2019, the spammy URLs began dropping out of the index until last week, when Search Console showed around 8,000 as "Indexed, not submitted in sitemap" but listed as "Valid" in the coverage report. Many of them are still hack-related URLs listed as indexed in March 2019, despite the fact that clicking on them leads to a 404.

As of this Saturday, the number jumped up to 18,000, but I have no way of finding out from the Search Console reports why the jump happened or what new URLs were added; the only sort mechanism is last crawled, and they don't show up there.

How long can I expect it to take for these remaining URLs to also be removed from the index? Is there any way to expedite the process? I've submitted a 'new' sitemap several times, which (so far) has not helped. Is there any way to see inside the new GSC view why/how the number of valid URLs in the index doubled over one weekend?
Intermediate & Advanced SEO | rickyporco
-
Website can't break into Google Top 100 for main keywords; considering a 301 redirect to a new domain
A little background on our case. Our website (ex: http://ourwebsite.com) officially went live in December 2015, but it wasn't on-site optimized and we hadn't done any off-site SEO for it. In April we decided to do a small redesign, and we did it on an online development server. Unfortunately, the developers didn't disallow crawlers and the website got indexed while we were developing it on the development server. The development version that got indexed in Google was http://dev.web.com/ourwebsite

We learned that it got indexed when we migrated the new redesigned website to the initial domain. When we did the migration we decided to add www, and now it looks like http://www.ourwebsite.com. Meanwhile, we deleted the development version from the development server and submitted "Remove outdated content" from the development server's Search Console. This was back in early May. It took about 15-20 days for the development version to get de-indexed and around 30 days for the original website (http://www.ourwebsite.com) to get indexed.

Since then we have started our SEO campaign with press releases, outreach to bloggers for guest and sponsored posts, etc. The website currently has 55 backlinks from 44 referring domains (ahrefs: UR25, DR37; Moz DA: 6, PA: 1) with various anchor text. We are tracking our main keywords and our brand keyword in the SERPs; for our brand keyword we are position #10 in Google, but for the rest of the main (money) keywords we are not in the top 100 results. It is very frustrating to see no movement in the rankings for the past couple of months, and our bosses are demanding rankings and traffic.

We are currently exploring the option of using another similar domain of ours and doing a complete 301 redirect from the original http://www.ourwebsite.com to http://www.ournewebsite.com. Does this sound like a good option to you? If we do the 301 redirect, will the link juice be passed from the backlinks that we already have from the referring domains to the new domain? Or, because the site seems "stuck," would it not pass any power to the new domain? Also, please share any other suggestions that we might use to at least break into the top 100 results in Google. Thanks.
Intermediate & Advanced SEO | DanielGorsky
-
Should I include URLs that are 301'd or only include 200 status URLs in my sitemap.xml?
I'm not sure if I should be including old URLs (content) that are being redirected (301) to new URLs (content) in my sitemap.xml. Does anyone know if it is best to include or leave out 301'd URLs in an XML sitemap?
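For illustration, here is a rough Python sketch (placeholder URLs) of one way to filter a candidate list down to URLs that answer 200 directly before writing them into a sitemap:

```python
# Rough sketch: keep only URLs that return 200 without a redirect hop,
# then print a minimal sitemap. The URLs below are placeholders.
import requests

candidate_urls = [
    "https://example.com/new-page",
    "https://example.com/old-page",  # suppose this one 301s to /new-page
]

live_urls = []
for url in candidate_urls:
    resp = requests.head(url, allow_redirects=False)
    if resp.status_code == 200:
        live_urls.append(url)

entries = "\n".join(f"  <url><loc>{u}</loc></url>" for u in live_urls)
print('<?xml version="1.0" encoding="UTF-8"?>')
print('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">')
print(entries)
print("</urlset>")
```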
Intermediate & Advanced SEO | Jonathan.Smith
-
Multilingual SEO - site using Google Translate within existing URL structure
Hi everyone - I've just been looking at a site that simply uses Google Translate on its existing pages. So basically, on any page you can Google Translate the content into any language you like - there's no change to the URL structure according to language, etc. I haven't come across this approach before (simply allowing users to Google Translate within the existing page) and it doesn't sit well with me - let me have your thoughts re: the SEO implications. Thanks in advance, Luke
Intermediate & Advanced SEO | McTaggart
-
Replicating keywords in the URL - bad?
Our site URL structure used to be (example site) frogsforsale.com/cute-frogs-for-sale/blue-frogs, where frogsforsale.com/cute-frogs-for-sale/ was in front of every URL on the site. We changed it by removing the for-sale part of the URL, so it became frogsforsale.com/cute-frogs/blue-frogs. Would removing the for-sale have hurt our rankings and traffic? Or was having for-sale in the URL twice (once in the domain, again in the URL) hurting our site?

The business wants to change the URLs again to put for-sale back in, but in a new spot such as frogsforsale.com/cute-frogs/blue-frogs-for-sale, as they are convinced that is the cause of the rankings and traffic drop. However, the entire site was redesigned at the same time and the site architecture is very different, so it is very hard to say whether the traffic drop is due to this or not.
Intermediate & Advanced SEO | CFSSEO
-
Advice on URL structure for competing against EMDs of a hot keyword
Here is the question, illustrated with an example. A law client focuses on personal injury. Their domain is nondescript. The question comes down to the URL structure for an article section of the site (I think I know what most people here will say, but want to raise this anyway). This section will have several hundred 'personal injury' articles at launch, with 100+ added each month by writers. Most articles do not mention 'personal injury' in the titles or in the content, but focus on the many areas in which people can hurt themselves :-).

Spreading a single keyword emphasis across many pages/posts is considered poor form by many, but the counter-argument is that hundreds of articles, all with 'personal injury' in the URL, could increase the overall authority of the site for that term (and may compete more strongly with EMD competitors). For instance, let's say Competitor A has this article: www.acmepersonalinjury.com/articles/tips-if-in-car-accident

And we had the following options:

Option A: www.baddomain.com/articles/tips-if-in-car-accident
Option B: www.baddomain.com/personal-injury-articles/tips-if-in-car-accident

Of course, for the term "car accident", Option A seems on equal footing with the ACME competitor. But what about the overall performance of the "personal injury" keyword (a HOT keyword in this space)? Would ACME always have an advantage (however slight) due to its domain? Would Option B help in this regard? The downside of course is that this pushes "car accident" further down in the URL string, making all articles perhaps less competitive on their individual keywords.
Intermediate & Advanced SEO | warpsmith
-
Re-Direct Users But Don't Affect Googlebot
This is a fairly technical question... I have a site which has 4 subdomains, each targeting a specific language. The brand owners don't want German users to see the prices on the French subdomain, so they force users onto the relevant subdomain based on their IP address. If a user comes from a different country (i.e. the US), they are forced onto the UK subdomain. The client is insistent on keeping control of who sees what (I know that's a debate in its own right), but the redirects we're implementing to make that happen are really making it difficult to get all the subdomains indexed; I think Googlebot is also getting redirected and is failing to do its job. Is there a way of redirecting users, but not Googlebot?
Intermediate & Advanced SEO | eventurerob
-
Is it better to use geo-targeted keywords or add the locations as separate keywords?
For example... state keyword (nyc real estate) or keyword, state (nyc, real estate) = 2 keywords. Thanks in advance!
Intermediate & Advanced SEO | Cyclone