URL Structure Q - /UniqueURL/ProductA or /SubcategoryURL/ProductA?
-
Hi Mozers,
I have a niche ecommerce site, http://www.ecustomfinishes.com, that sells custom barn wood furniture, with about 600 products online. Two weeks ago I started rewriting my product URLs from /subcategoryurl/ProductA to /UNIQUEURL/ProductA.
For example, for my farm tables subcategory (150 products) I had:
- /rustic-farm-tables/productA
- /rustic-farm-tables/productB
...and so on, with "rustic-farm-tables" repeated about 150 times.
Two weeks ago I started changing those 150 instances of "/rustic-farm-tables/" to more descriptive URLs such as:
- /white-farm-table/productA
- /rustic-square-dining-table/productB
- /black-harvest-table/productC
Here is why I need advice:
- I have 1,181 pages.
- The best-performing page containing "rustic-farm-tables" ranks only #31 of 1,181 by entrances; the second-best ranks #71.
- By contrast, I have 13 table product pages, such as /12ft-Rustic-Farm-Dining-Table-p/12-foot-table-with-inlay.htm, that each get more entrances than any product page containing "rustic-farm-tables".
- Since changing the URLs to be product-specific, my overall traffic has dropped 20%!
So here are my questions:
- Do I continue making /UNIQUEURL/product unique to each product, which is consistent with my best-performing pages but has dropped my traffic 20% in the last two weeks? Or do I keep /SAME-URL/product, which is written up as a best practice, and be happy with the traffic I had?
- Could the 20% drop just be a temporary shock? Why would this happen?
This would make a good long-tail vs. head-term experiment: try to capture more head terms, or focus on what works for the long tail. I hope I explained this well. My instinct is to follow the pattern of my best-performing pages, but the 20% drop has me worried.
Thank you in advance for your help
-
Hi Simon,
Thank you for your response. I understand the Google algorithm update, but according to Matt Cutts it seems aimed at exact-match domains built to capture a single search, like www.seohelpinmassachussetts.com, and wouldn't affect something like website.com/seo-help-in-mass.
To your second comment, my apologies if I did not explain clearly enough. The driver for my changes is that a few of my product pages get a lot of entrance traffic while most get very little. I want more entrances for the product pages, so I am replicating what is working: nearly all of the pages getting high entrances use /uniqueurl/product, whereas the ones with /subcategory/product get very little. The shock is the drop in traffic despite replicating what works best on my site. I posted this question to see if anyone has reactions, suggestions, or cautions before I commit to this major adjustment.
best,
Chris
-
Hi Chris
Your timing was quite unfortunate, as Google rolled out two fairly major algorithm changes on 27th September, two weeks ago today. This may have muddied the waters a little when it comes to understanding the reason for your drop in traffic.
Doing a lot of URL rewriting at once is always quite risky. Presumably you put 301 redirects in place from the old URLs to the new structure? If you didn't, that is a crucial step when altering your URL structure or changing page names, and it may well be related to the traffic drop. Some useful pointers on that here: http://www.seomoz.org/q/transferring-to-new-url-structure-301-existing-ones
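With a rename like this, each old product URL should 301 to exactly one new URL. As a minimal sketch (the paths and the helper below are illustrative, not taken from the live site), the redirect rules could be generated from a simple old-to-new mapping rather than written by hand for 150 products:

```python
# Hypothetical sketch: generate Apache mod_rewrite 301 rules from a
# one-to-one mapping of old product URLs to new product URLs.
# The example paths are made up for illustration.
old_to_new = {
    "/rustic-farm-tables/productA": "/white-farm-table/productA",
    "/rustic-farm-tables/productB": "/rustic-square-dining-table/productB",
}

def redirect_rules(mapping):
    """Emit one permanent-redirect RewriteRule line per renamed URL."""
    rules = []
    for old, new in sorted(mapping.items()):
        # Real product slugs may contain regex metacharacters and
        # would need escaping before going into a RewriteRule pattern.
        rules.append(f"RewriteRule ^{old.lstrip('/')}$ {new} [R=301,L]")
    return rules

for rule in redirect_rules(old_to_new):
    print(rule)
```

Dropping the emitted lines into the site's .htaccess (with mod_rewrite enabled) would preserve link equity from the old URLs, and keeping the mapping in one place makes it easy to audit that no old URL was missed.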
I am not sure I understand your driver for the changes: the URLs were already unique. As long as the ProductA part of the URL is relevant, unique, and descriptive, it probably doesn't warrant much time changing things.
I'd suggest keeping the /same-url/product structure, making sure the rest of your on-page SEO is tight, and focusing on creating new, unique, interesting content to go with your products so they stand out. Maybe pictures or stories of some of your furniture in a home environment, or something along those lines.
Cheers
Simon