If the main category is Digital Marketing, then I would have the URL be /digital-marketing/. I think it's important to consider how you plan to use the category in the future and build for that, so you don't end up with a funky structure and/or have to do a bunch of redirects to fix it later. I understand that's not always possible, and things may come up you hadn't considered.
Schwaab
@Schwaab
Job Title: SEO Specialist
Company: Schwaab, Inc
In-house SEO for Schwaab, Inc. I work to develop eCommerce business and brand awareness.
Favorite Thing about SEO
Technical SEO
Latest posts made by Schwaab
- RE: Recommended URL Structure
I would have your URL represent your site architecture. If Digital Marketing is a subcategory of Marketing, I would have the URL structure represent that by using example.com/marketing/digital/...
If you plan on adding more subcategories at a later date, having your URL structure mirror your site architecture will save a lot of headaches.
- RE: High Number of Crawl Errors for Blog
It is true that you will most likely not be penalized for these pages; Google is pretty good at figuring out common canonicalization problems, in my opinion, and would most likely not penalize you for having duplicate content. I would encourage you to dig a little deeper, though, and see what additional problems these pages could create.
Consider that Google will waste valuable crawl bandwidth crawling these meaningless pages rather than focusing on the important content you want it to. If Google is crawling them, you can most likely bet that PageRank is flowing through these pages as well, diluting the link equity of your site.
Are you using WordPress? There are a lot of great plugins that can help you manage these pages. You could control how Google crawls these pages with your robots.txt, by placing meta robots tags on the pages using a plugin, or by placing rel=canonical tags on the pages pointing back to the original source.
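A rough sketch of the three approaches above (all paths and URLs are hypothetical placeholders):

```text
# robots.txt — stop crawlers from fetching the duplicate paths at all
User-agent: *
Disallow: /tag/
Disallow: /author/
```

```html
<!-- meta robots tag in the <head> of the duplicate page itself -->
<meta name="robots" content="noindex, follow" />

<!-- or point the duplicate back at the original source -->
<link rel="canonical" href="http://www.example.com/original-post/" />
```

One caveat: a page blocked in robots.txt is never crawled, so Google will not see a meta robots or canonical tag placed on it. Pick one approach per set of URLs rather than stacking them.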
- RE: How do I set a filter in Google Analytics?
If it is showing as direct traffic, you would have to filter out specific IP addresses; direct traffic doesn't have a source to filter out.
- RE: Mystery 404's
I crawled your site and didn't see the 404 errors.
I did notice that the sitemap referenced in your robots.txt returns a 404, so you may want to take a look at that.
- RE: Mystery 404's
Are you seeing these 404s in Webmaster Tools or when crawling the site?
If it's WMT, where does it say the 404 is linked from? Click on the URL with the 404 error in WMT and select the "Linked from" tab.
Crawl the site with Screaming Frog and your user agent set to Googlebot. See if the same 404 errors are being picked up and if so, you can click on them and select the "In Links" tab to see what page the 404 is being picked up on.
I checked the source code of some of the pages on www.kempruge.com and didn't see any relative links which usually create problems like this. My bet is on a site scraping your site and creating 404 errors when they link back to your site.
- RE: Custom Wordpress Theme - HTML5 Outline - H1 display: none
You are essentially cloaking a keyword rich H1 tag. I would not do this as it is against Google Webmaster Guidelines.
- RE: Hash URLs
I misunderstood you before, I thought you meant the old URLs had the anchors.
You are correct, technically the tabs are not unique pages. You would have to redirect each of the previous pages to http://www.teapigs.co.uk/tea/matcha_shop rather than to the anchored URL.
Having content under tabs may limit your ability to rank for a variety of keywords. For example, if previously there was a page ranking for "What is Matcha?", it may now be difficult to rank for this term because there is no longer a unique page dedicated to the topic. You lose the ability to have a unique URL, Title Tag, Meta Description, H1, and so on.
- RE: Hash URLs
Is the content technically on one page (www.website.co.uk/product) and just being displayed based on the anchor in the URL?
Has Google indexed the anchored URLs? In my experience Google does not index anchored URLs.
I'd love to see an example to see how it is coded; however, if they are just anchored URLs displaying content that is all located on one page, the products page, then the products page would be the only page you can redirect. Technically, anchored URLs are not unique pages.
If the content is being generated with AJAX and your developers are using the hashbang method to serve a unique URL, I don't believe you would see the hash in the URL.
Best posts made by Schwaab
- RE: Wordpress: Should your blog posts be noindex?
Your blog posts should be indexed and followed. The noindex setting may just be a default so development pages do not get indexed by search engines.
I would recommend using noindex tags on the various category, archive, and tag pages that are created when using a WP blog. These pages can lead to a lot of duplicate content problems.
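As a minimal sketch, the noindex tag described above goes in the `<head>` of each category, archive, and tag page; "follow" lets link equity still flow through the page. Many WordPress SEO plugins can add this for you automatically:

```html
<meta name="robots" content="noindex, follow" />
```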
- RE: The risk of semi-hidden text, which only shows-up when page viewer clicks button.
Check out this video: http://www.youtube.com/watch?v=EsW8E4dOtRY
Summary: If you're not trying to stuff hidden text in there then don't worry about it, it's a normal thing on today's web.
- RE: How to see organic traffic only?
Are you properly tagging your PPC destination URLs? If these URLs are not properly tagged some of the paid traffic coming in may be attributed to organic.
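As a hypothetical example (the domain and campaign name are placeholders), a properly tagged PPC destination URL carries UTM parameters that tell Google Analytics to attribute the visit to paid search rather than direct or organic:

```text
https://www.example.com/landing-page/?utm_source=google&utm_medium=cpc&utm_campaign=spring_sale
```

If the paid traffic is coming from AdWords, enabling auto-tagging accomplishes the same thing via the gclid parameter.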
- RE: Keyword Difficulty Showing ONLY Bing Search Volume (Exact Match)
The "Google US" means that they are analyzing the top results on Google to determine the keyword difficulty.
Google does not let people use its AdWords API (where Moz would pull search volume from) if they violate its guidelines. One of those guidelines is that you cannot scrape search results (which Moz does to determine Google US difficulty). I believe the keyword search volume was switched to Bing data to work around this issue. I do not believe there is a way to have Google search volume display in the keyword difficulty tool.
Anyone feel free to correct me if I got any of that wrong.
- RE: High Ranking site with very low amount of texts, HOW?
I've done a little SERP analysis recently to help figure out how much content I really needed to rank well. What I found was that for the queries I researched, sites ranking in the top ten either had low PA/DA and a ton of text (2k+ words on the page) or high PA/DA and a low amount of text (~300 words). Obviously, sites that had high PA/DA and a ton of text crushed it in the SERPs. This all goes back to the classic point that correlation does not equal causation.
There seemed to be a point where a page with low PA/DA could still rank well if you had enough content, and vice versa. I would think if the site you are referring to has minimal text on the page its backlink metrics are pushing it to the top of the SERPs. Also, good UX will improve time on site which may help it surface to the top of the SERPs.
Again, correlation does not equal causation. This is just something I've seen for a sample of keywords I am targeting. I've used this to determine which landing pages on my site have low PA and have worked on adding some relevant text to those pages.
- RE: Test site is live on Google but it duplicates existing site...
I would add noindex, nofollow tags to all of those pages as well, just to be safe. I've had Google index things I've had blocked via my robots.txt in the past. Also, make sure that if you update your sitemap, you don't accidentally include these test URLs.
- RE: Is having two blogs bad?
I wouldn't intentionally create duplicate content. I would talk with your company and stress the importance of avoiding duplicate content creation.
As for where to host the blog: I would recommend hosting it on your site if possible. Links to the blog posts will benefit the domain the blog is hosted on. If the blog lives elsewhere, the only way to really benefit your site would be to link to pages on your site from within blog posts; however, that would not be as beneficial as hosting the blog on your company's site and building links directly to your company's domain.
- RE: Big Jump in Domain Rank
I believe Moz had some issues with their Linkscape index over the past few months. The metrics for the last update were put together using a smaller sample of links, meaning the numbers were down across the board for most sites. The problems have since been corrected, and the metrics are now being calculated using a larger sample of links, so they went back up.
- RE: I want to recrawl my site manually
You can't as part of your campaign; however, if you want to run a crawl, check out http://pro.moz.com/tools/crawl-test. It won't update your campaign, but it's a great way to get information about your site.