George is right; I think it might be best to ask this question here: http://forums.iis.net/
Posts made by DaveSottimano
-
RE: Mixing static.htm urls and dynamic urls on a Windows IIS Server?
-
RE: Pros or Cons of adding Schema Markup via HTML or through Webmaster Data Highlighter
Hmm, not sure about that; I've just tested it and it doesn't. Thinking about it for a second, Google would need access to your site to do that. It would be awesome though.
-
RE: No Google Analytics code on page BUT reporting is active
Hi Mike,
There are some good suggestions from the folks above, but I'm assuming you're still having issues. It sounds like you might need an extra set of eyes, so if you're still stuck, I'd be happy to take a look for you. Of course, it's all in complete confidence - you can e-mail me at david.sottimano@distilled.net if you want to take up the offer.
Hopefully it's just a quirk
Dave
-
RE: Outdated Videos and SEO
Hi Rachel,
I guess this might be worth a blog post, since you didn't find any resources.
I'm going to try to tackle your questions in steps:
1) What do people recommend to do with old, outdated videos on sites like YouTube and Vimeo
This is just my personal opinion, but if these videos are still getting views, try to direct viewers to the new videos using annotations and the like. Unless a product video is actually harmful (example: it was flagged as toxic waste ;)), why take it down if it's still getting traffic? Okay, a better example would be videos so outdated that they look embarrassing for your brand, but even then you can cover up the entire video with an annotation pointing to the new product video...
2) They show up in search, but some mention outdated products
Which search? Regular Google search / video / YouTube / Vimeo? If it's regular Google search, you can change that by de-optimizing the listing on YouTube and getting a new video ranking instead.
3) However, is it best practice to remove everything older and outdated from Youtube, etc. or is it better to have these in your library (quantity over quality)
There are no rules here, every case is different.
4) We also started Google Plus after Youtube, so we now have two YouTubes (one empty one attached to our new G+ and one that's been established with a lot of videos - new and old)
For simplicity, it might be better to have one account.
Good suggestion from Microdesign, but I don't think updating an existing video is possible on YouTube (https://support.google.com/youtube/answer/58101?hl=en).
-
RE: Conflicting numbers in Google Analytics
One thing that stands out to me is that you asked your question on June 8th and you were comparing the 7th and the 6th, so there might be a reporting delay that could cause conflicting numbers. I've seen this before, and yes, it's weird :S
Is this still happening? Can you compare other date ranges and see what you get?
Just in case, make sure you're comparing the same reports and that there aren't any segments applied. If you can compare the numbers in an unfiltered view, that would be even better.
-
RE: Not Provided Filter is not tracking goals
Hey,
I'd love to help, but it seems like the screenshots have been taken down. Can you re-upload to http://imgur.com/ and send the link through?
Dave
-
RE: Why isn't our structured markup showing in search results
Hi
As Dennis confirmed, the 2nd link is showing the rich snippets. Try not to rely on the site: operator; instead (first choice), query Google for a term where your result should naturally appear first, in your target country. For example, https://www.google.com/search?site=&source=hp&q=IT+and+Management+Training+Alpharetta%2C+GA+&gl=us yields the desired result: http://screencast.com/t/szChNsHMjS42
Alternatively, you can always try the info: operator, which is usually more reliable. Your first result is a bit harder to explain: http://screencast.com/t/nY3u6eCdeIfV. You have 2 results appearing for the exact page title query, and that might be the reason why you aren't seeing it (at least in this example): https://www.google.com/search?gl=us&q=Java%2C+Perl+and+Python+Programming+Training+&gl=us
Just as David-Kley said above, you've implemented the microdata correctly, and it's really up to Google whether the query deserves rich snippets or not. On the upside, you are getting quite a few rich snippets, so just look around: https://www.google.com/search?gl=us&q=Perl++Programming+Training&gl=us
Thanks,
Dave
-
RE: Question regarding geo-targeting in Google Webmaster Tools.
Hi there, your suggested setup is perfectly fine. You're able to (and allowed to) target the root domain to a specific country while targeting subfolders to other countries. The geotargeting on the main domain's URLs will be overridden wherever you specify a different target for a subfolder. For example:
if you target www.domain.com -> UK, then
- www.domain.com/this -> UK
- www.domain.com/that -> UK
and if you target www.domain.com/us -> US, then
- www.domain.com/us/this -> US
- www.domain.com/us/that -> US
Does that make sense?
-
RE: Correctly Dealing With Redirects
Just adding to what the others have said, but there could be a number of reasons:
- WMT is slow
- You actually lost the links
- The links aren't being redirected properly, i.e. multiple hops or no redirect at all
- 302s and other redirects CAN actually show up in WMT; you'll see them listed as "via this intermediary link..."
The advice the others gave is solid; you'll need to double-check with some kind of HTTP response code tool. Please supply a URL.
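To make that concrete, here's a minimal sketch of the kind of check I mean, assuming Node 18+ for its built-in fetch (the URL is just a placeholder):

```javascript
// Minimal redirect-chain tracer (Node 18+, built-in fetch).
// Prints each hop's status code so you can spot 302s, multiple hops,
// or a missing redirect at a glance.
async function traceRedirects(url, maxHops = 10) {
  let current = url;
  for (let hop = 0; hop < maxHops; hop++) {
    const res = await fetch(current, { redirect: "manual" });
    console.log(res.status, current);
    const location = res.headers.get("location");
    if (res.status >= 300 && res.status < 400 && location) {
      current = new URL(location, current).toString(); // resolve relative Location headers
    } else {
      return; // final response reached (200, 404, etc.)
    }
  }
  console.log("Stopped: too many hops (possible redirect loop)");
}

traceRedirects("http://www.example.com/old-page").catch(console.error); // placeholder URL
```

Anything other than a single 301 straight to a 200 is worth fixing.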
Cheers,
Dave
-
RE: Enable Demographics and Interests reports using analytics.js
Hi Paul,
Okay, maybe we can take this one step at a time. Have you swapped your standard ga.js tracking code for the doubleclick version (dc.js)?
There's one line in the JavaScript that should look like this: ga.src = ('https:' == document.location.protocol ? 'https://' : 'http://') + 'stats.g.doubleclick.net/dc.js';
Okay, so if that's all done, head to your GA account.
- Select the profile you wish to enable demographics reports for, and click on Admin (top right)
- In the middle column, click Profile Settings
- Enable demographics reports, and wait a day or two for data to flow in.
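One more thing on the code side, since your question title mentions analytics.js: if you're on analytics.js (Universal Analytics) rather than ga.js, my understanding is that you don't swap in dc.js at all - you just require the display features plugin before the pageview. A sketch with a placeholder property ID:

```javascript
// analytics.js (Universal Analytics) version - no dc.js swap needed.
ga('create', 'UA-XXXXX-Y', 'auto');   // placeholder property ID - use your own
ga('require', 'displayfeatures');     // turns on Demographics & Interests collection
ga('send', 'pageview');
```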
I see you might be using Tag Manager; if so, read this:
"Please note that if you use Google Tag Manager, you should select "Add Display Advertiser Support" in your Google Analytics tag template; and if you are using a 3rd party tag management tool Google Analytics might not be able to validate your code, but you should be able to skip validation and the reports will work." - Source http://online-behavior.com/analytics/demographics
-
RE: Ranking without links but through Social Media popularity
I had help from the UK SEO community with a Google +1 experiment recently - it's a bit silly, but I think this is what you're looking for: http:// [www] davidsottimano [.com] /gareth-hoyle-sexiest-man-alive/ (purposely not a link!). It had around 40-50 +1s last time I checked.
Ranks #4-#5 https://www.google.co.uk/search?q=gareth+hoyle&gl=uk
Feel free to use it if it suits you.
-
RE: Page ranking in .com but not in .co.uk
Hey there,
Not sure I'm seeing the same SERPs as you...
Okay, for "terminology management" you rank 1st in the UK: https://www.google.co.uk/search?biw=1366&bih=683&q=terminology+management&gl=uk&pws=0
and #3 for "translation memory": https://www.google.co.uk/search?gl=uk&pws=0&q=translation+memory
Notice that I use two additional parameters in the search URL: pws=0 removes personalization, and gl=uk tells Google I'm searching from the UK.
No offence to Maximillion here, but hreflang is not the right choice for you at the minute. If you start losing ground to UK-based websites for these generic English terms, you can follow the previous advice, but you'd have to make some really clever/complicated adjustments. As things stand, you're in very good shape and you shouldn't need to do anything else.
-
RE: Experience/suggestions in redirecting old URLs (from an existing site) to new URLs under a new domain
Just to add to Alan's answer, and this is from pure experience:
- You'll likely see a "down" period. Expect organic traffic to drop for a few weeks and hopefully get back to previous levels or better. If you rely on organic, up your spend on PPC during this period.
- Do not redirect internal pages to the root: domainA.com/page should redirect to an equivalent inner page on the new domain. Don't start redirecting groups of pages to the new root.
- Like Alan said, pages without external links are not worth redirecting. Make sure you pull all of your links from OSE, Majestic, Ahrefs and WMT to get a decent picture. Ensure you're redirecting to similar content too!
- Change the site address in WMT and verify the new site in WMT - watch this like a hawk, especially crawl errors.
- Once the new site is up, submit your XML sitemaps to WMT.
- Triple check your robots.txt and other robots directives - I've seen too many sites go live with NOINDEX and full robots.txt blocks.
- Fetch pages in WMT to ensure you're getting 200 response codes on your most important / new pages.
- Recrawl all of your old (redirected) pages to ensure they return a single 301 followed by a 200 - no chains, just one 301 (see the sketch below).
There's probably more, but those are the essentials.
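For the recrawl step above, here's a rough sketch of a bulk check, assuming Node 18+ and a placeholder old-urls.txt file (one old URL per line, pulled from your link tools):

```javascript
// Rough bulk check for a migration (Node 18+, save as check.mjs):
// every old URL should return exactly one 301 that lands directly on a 200.
import { readFile } from "node:fs/promises";

async function checkOldUrl(url) {
  const first = await fetch(url, { redirect: "manual" });
  if (first.status !== 301) return `${url} -> ${first.status} (expected a 301)`;
  const target = new URL(first.headers.get("location"), url).toString();
  const final = await fetch(target, { redirect: "manual" });
  if (final.status !== 200) return `${url} -> ${target} -> ${final.status} (chain or error)`;
  return null; // all good
}

// old-urls.txt is a placeholder: one URL per line, exported from OSE / Majestic / Ahrefs / WMT.
const urls = (await readFile("old-urls.txt", "utf8")).split("\n").filter(Boolean);
for (const url of urls) {
  const problem = await checkOldUrl(url);
  if (problem) console.log(problem);
}
```

Anything it prints needs another look before you call the migration done.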
-
RE: Should comments and feeds be disallowed in robots.txt?
WordPress is a funny platform: you'd think there isn't much to disallow, but there probably is quite a bit. I agree with Federico - you should allow comments, feed, and RSS.
I'm not going to make blind assumptions here, so you should check your log files to see what's being crawled constantly; feel free to read this: http://moz.com/blog/server-log-essentials-for-seo.
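If it helps, this is the kind of quick-and-dirty log check I mean - a sketch assuming a standard combined-format access log (the access.log file name is a placeholder) that counts which URLs Googlebot hits most:

```javascript
// Quick crawl-frequency check from a combined-format access log (Node).
// "access.log" is a placeholder - point it at the raw logs you archive from cPanel.
import { readFileSync } from "node:fs";

const counts = {};
for (const line of readFileSync("access.log", "utf8").split("\n")) {
  if (!line.includes("Googlebot")) continue;                   // naive user-agent filter
  const match = line.match(/"(?:GET|POST|HEAD) ([^ ]+) HTTP/); // pull the request path
  if (match) counts[match[1]] = (counts[match[1]] || 0) + 1;
}

// Print the 20 most-crawled URLs.
Object.entries(counts)
  .sort((a, b) => b[1] - a[1])
  .slice(0, 20)
  .forEach(([url, hits]) => console.log(hits, url));
```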
FYI - This is a big job. Shout if you need help.
P.S. - Hostgator's cPanel will allow you to archive raw server logs; make sure you check that option from now on or they'll be overwritten!
-
RE: New "Static" Site with 302s
Hi Danny!
I don't have much to add here; I think the guys have it right in that you'll need to figure out how to make the 301 work. I quickly read that documentation, then realized I wasn't a robot, so I found this: http://aws.typepad.com/aws/2012/10/amazon-s3-support-for-website-redirects.html which was a bit friendlier.
I wish I could help you out more, but I'm not using AWS. I'm assuming you'll be able to use wildcard or regex matching somewhere, and that should solve your problem.
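For what it's worth, if the site is served from S3's static website hosting, the bucket's website configuration does support routing rules with prefix matching (not full regex, as far as I know) - something along these lines, with placeholder prefixes:

```xml
<RoutingRules>
  <RoutingRule>
    <Condition>
      <KeyPrefixEquals>old-blog/</KeyPrefixEquals>
    </Condition>
    <Redirect>
      <ReplaceKeyPrefixWith>blog/</ReplaceKeyPrefixWith>
      <HttpRedirectCode>301</HttpRedirectCode>
    </Redirect>
  </RoutingRule>
</RoutingRules>
```

That won't cover every pattern, but it handles the common "whole folder moved" case with proper 301s.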
Great site by the way - are you using anything to help out with the static blog (Jekyll, Octopress)?
-
RE: Using advance segments or primary dimensions?
John Barth has the right answer below. Not sure if I can add much more, except specific instructions:
When you apply advanced segments to reports in GA, the report may be sampled. If it is, you'll notice either a message in a yellow container or a square icon that looks like a QR code in the top right corner of the interface. The visit counts may never match up exactly, but if you adjust the precision of the report to its highest level, you'll get much closer.
Explanation from Google documentation:
Apart from the standard reports, users may issue ad-hoc queries to Google Analytics. Common queries include applying advanced segments to standard reports, applying a secondary dimension, or running a custom report. When the front-end issues a query, GA inspects the set of pre-aggregated tables to determine whether the query can be wholly satisfied by existing aggregates. If not, GA goes back to the raw session data to process and compute aggregate data on-the-fly. If the resulting report is sampled, you will always see a yellow box at the top of the report which says: This report is based on N visits.
-
RE: HTML5 & the doc outline algorithm
HTML5 allows for multiple h1s in header groups - it's totally fine. Read more here: http://www.distilled.net/blog/seo/fixing-seo-problems-with-html5/
So is having more than one h1 on a page with the same value going to have a negative SEO impact?
<header>
<h1>If I have too much of the same copy on the page</h1>
</header>
Is this unique enough?
<header>
<h1>If I have too much of the same copy on the page</h1>
</header>
If the text in those two H1 tags makes your page more than 70% duplicate (that's an arbitrary number, btw), I'd say you'd have an issue. I suspect that's very unlikely and that you won't have a problem. I don't fully understand why you'd want to repeat the same text, though; I'd encourage you to use variants of the keywords in the top h1 (see the example below).
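To illustrate the "variants" point, something like this is what I'd aim for (the copy is obviously placeholder):

```html
<!-- Both h1s are valid HTML5; varying the copy keeps the headings from duplicating each other. -->
<header>
  <h1>Acme Blue Widgets for Small Businesses</h1>
</header>
<section>
  <header>
    <h1>Why Small Businesses Choose Acme Blue Widgets</h1>
  </header>
</section>
```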
Good luck!
-
RE: How to remove an entire site from Google?
Hmm, I don't quite understand. If you bought the website, wouldn't you want to take its authority and redirect it to your primary site?
Re: subdomains > You can use the same WMT account to add and verify the subdomains.
Re: Noindex tag > This all depends on how often Google crawls the site; there is no minimum or maximum amount of time it will take. I'd say that one year is an edge case, and there were probably other factors at play (e.g. the noindexed pages were orphaned).
-
RE: How to remove an entire site from Google?
The quickest way to remove an entire site is:
**A:** Block everything in robots.txt - add these lines to your file:
User-agent: *
Disallow: /
**B:** Set up and verify Webmaster Tools, then go to your dashboard and select Optimization (left menu) > Remove URLs > Create new removal request.
When it asks you to enter a URL, just specify "/" (without quotes) to signal the root.
Note! You'll have to repeat the verification / removal process for all of the subdomains as well.
That *should* knock out your site within a few days.
In the unlikely event that doesn't work, do this:
**A:** Remove the robots.txt block
**B:** Add the meta robots NOINDEX tag to each page (see the snippet below)
**C:** Once the pages are completely gone (use site:example.com to check), put the robots.txt block back on.
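For step B, the tag goes in the <head> of every page:

```html
<meta name="robots" content="noindex">
```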
**Question for you:** Why exactly are you doing this? There might be a better solution for added SEO benefit if you explain why...
Cheers,
Dave