Posts made by BostonWright
-
RE: How can I get Google to forget an https version of one page on my site?
Thanks Hilary - I'm concerned that if I do that, I would lose the http version as well.
-
How can I get Google to forget an https version of one page on my site?
Google has mysteriously decided to index the broken https version of one page on my company's site (we have a cert for the site, but this page isn't designed to be served over https and the CSS doesn't load).
The page already has many incoming links to the http version, and its canonical URL uses http. I've also resubmitted the http version in Webmaster Tools. Is there anything else I could do?
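In the meantime, here's the kind of spot-check I've been thinking about to confirm what each protocol version actually serves and which canonical it declares. Purely a sketch - the domain and path are placeholders, and it assumes Node 18+ with its built-in fetch:

```typescript
// Spot-check what each protocol version of the page serves and which
// canonical it declares. Intended as a Node 18+ script (built-in fetch);
// the domain and path below are placeholders.
const PAGE_PATH = "/some/page.html";

async function inspect(url: string): Promise<void> {
  // "manual" keeps the redirect response itself so we can see the status code.
  const res = await fetch(url, { redirect: "manual" });
  const location = res.headers.get("location") ?? "(none)";
  let canonical = "(not checked)";
  if (res.status === 200) {
    const html = await res.text();
    // Deliberately naive extraction - fine for a spot-check, not for general HTML parsing.
    const match = html.match(/<link[^>]+rel=["']canonical["'][^>]*href=["']([^"']+)["']/i);
    canonical = match ? match[1] : "(no canonical found)";
  }
  console.log(`${url}\n  status: ${res.status}  location: ${location}\n  canonical: ${canonical}`);
}

async function main(): Promise<void> {
  await inspect(`http://www.example.com${PAGE_PATH}`);
  await inspect(`https://www.example.com${PAGE_PATH}`);
}

main().catch(console.error);
```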
-
Google Alert not working - anyone else have this problem?
I have a Google Alert that has stopped pulling in recent results even though a web search indicates that the pages are being indexed. None of the alert settings have changed. Anyone else have this happen recently and know how to remedy this problem?
-
Any official stats on authorship impact on SERP click-throughs?
I'm trying to make another push to get authors on my site to set up authorship. Can anyone point me in the direction of any official stats from Google on the impact of authorship on SERP click-throughs? I've seen some articles on SEO sites suggesting the lift is anywhere from 20-40%, but those seem to be site-specific cases. I'm wondering if there are any broader studies out there or official data from Google.
Thanks!
-
Transition to new CMS - Moving homepage - any tips?
We've been transitioning a large site to a new CMS for the last few months and we are finally getting ready to move the homepage and high level sections. Are there any good articles or tips for this portion of our migration?
URLs will be staying the same, so redirects aren't needed; we're recreating all the existing metadata in the new CMS, rebuilding our sitemaps in the new platform, etc.
I wasn't sure if there were specific things I should pay close attention to for the homepage & section fronts that differ from other pages on the site when migrating.
Thanks in advance!
-
Google Webmaster Tools not displaying all backlinks
I was asked to take a look at a site that a friend redesigned and that immediately plummeted in rankings. It was a Flash site where the domain redirected to a Flash landing page. When the site was redesigned, they removed the Flash page and launched the new site at the domain URL without changing the redirect or putting a 301 in place on the old Flash landing page URL.
This site didn't have many backlinks to begin with, but it did OK for some important search phrases. Now when I look in Webmaster Tools, there are only 2 backlinks showing. Meanwhile, I know there are more backlinks in existence, primarily from Yellowpages and Citysearch.
I've had them put a 301 on the old flash landing page pointing at the domain URL and they've added a canonical URL. What have I missed here?
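For what it's worth, this is roughly how I've been thinking of verifying the redirect chain now that the 301 is in place - a rough Node 18+ sketch, and the URL is a placeholder rather than the client's real landing page:

```typescript
// Follow a redirect chain hop by hop and print each status, so we can confirm
// the old Flash landing page now returns a single 301 straight to the domain
// URL. Node 18+ (built-in fetch); the URL below is a placeholder.
async function traceRedirects(startUrl: string, maxHops = 5): Promise<void> {
  let url = startUrl;
  for (let hop = 0; hop < maxHops; hop++) {
    const res = await fetch(url, { redirect: "manual" });
    const location = res.headers.get("location");
    console.log(`${res.status}  ${url}`);
    // Stop when we reach a non-redirect response or there's no Location header.
    if (!location || res.status < 300 || res.status >= 400) return;
    // Resolve relative Location headers against the current URL.
    url = new URL(location, url).toString();
  }
  console.log(`Stopped after ${maxHops} hops - possible redirect loop.`);
}

traceRedirects("http://www.example.com/old-flash-landing-page.html").catch(console.error);
```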
-
RE: How do I get 2 column Google sitelinks instead of one line sitelinks?
Thanks. I had a feeling this was the case, but I just wanted some confirmation.
-
How do I get 2 column Google sitelinks instead of one line sitelinks?
Currently, if you search for my site's brand name on Google, we are the top result. However, rather than having 2 columns of sitelinks, there is just one line of 4 sitelinks. When you search for the site's domain (sitename.com), you get the full 2 columns of sitelinks. Are there any strategies for getting the 2 columns on more than just the domain name search? At the very least, I'd like to get 2 columns to appear when you do a brand name search, but it'd be great to get 2 columns of sitelinks for our top search queries as well.
Thanks for the advice...
-
RE: What's the best way to manage content that is shared on two sites and keep both sites in search results?
Does a duplicate content penalty impact specific pages or entire sites? If I wanted to test using the cross-domain canonical on a certain section of my site, would the impact be visible? Or would I need to put cross-domain canonicals on everything appearing on both sites in order to see the results?
-
RE: What's the best way to manage content that is shared on two sites and keep both sites in search results?
If I used the cross-domain canonical, would that mean that one site would stop appearing in search results?
-
RE: What's the best way to manage content that is shared on two sites and keep both sites in search results?
Changing the articles or even page titles is not an option.
-
What's the best way to manage content that is shared on two sites and keep both sites in search results?
I manage two sites that share some content. Currently we do not use a cross-domain canonical URL and allow both sites to be fully indexed. For business reasons, we want both sites to appear in results and need both to accumulate PR and other SEO/Social metrics. How can I manage the threat of duplicate content and still make sure business needs are met?
-
Building links to reviews: Movies, Restaurants, etc
My site has lots of reviews: movies, TV, music, restaurants, etc. I'm looking for sites that take critic reviews and link back to the critic's site. We're already on Metacritic.com and Rottentomatoes.com, which seem to be the two biggies for movies & TV. But I'm sure there are more out there. Also, I'd love to find sites for restaurant reviews (specifically in the Boston area). Are there any lists out there of sites like these? I'm NOT looking for UGC sites that allow user reviews. I'm looking for sites that accept professional critic reviews, preferably via some sort of automated feed.
Thanks Mozzers!
-
RE: Is it safe to not have a sitemap if Google is already crawling my site every 5-10 min?
Thanks Robert. As you surmised, our URLs are not changing (thankfully!). Fortunately, for now, our Google News sitemap still works. The only arguments I've come up with so far are:
- Having a sitemap will help SEs recrawl updated stories faster.
- Having a sitemap will help SEs find out when a URL has changed.
In my experience, Google does not index changes to existing pages as quickly as newly published articles. My thinking is that if we supply the changes via sitemap, reindexing speed will improve.
Thoughts?
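To make the first argument concrete, here's a rough sketch of the kind of sitemap entry I have in mind - one entry per article with lastmod bumped on every edit, so crawlers get an explicit signal that an existing URL changed. The URLs and dates are made up:

```typescript
// Sketch: generate a basic XML sitemap where <lastmod> is updated whenever a
// story is edited in the CMS. Article data below is made up for illustration.
interface Article {
  url: string;
  lastModified: Date; // bumped on every edit in the CMS
}

const recentlyUpdated: Article[] = [
  { url: "http://www.example.com/news/story-one.html", lastModified: new Date("2012-06-01T14:05:00Z") },
  { url: "http://www.example.com/news/story-two.html", lastModified: new Date("2012-06-01T13:40:00Z") },
];

function escapeXml(value: string): string {
  return value
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;");
}

function buildSitemap(articles: Article[]): string {
  const entries = articles
    .map(
      (a) => `  <url>
    <loc>${escapeXml(a.url)}</loc>
    <lastmod>${a.lastModified.toISOString()}</lastmod>
  </url>`
    )
    .join("\n");
  return `<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
${entries}
</urlset>`;
}

console.log(buildSitemap(recentlyUpdated));
```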
-
Is it safe to not have a sitemap if Google is already crawling my site every 5-10 min?
I work on a large news site that is constantly being crawled by Google. Googlebot is hitting the homepage every 5-10 minutes. We are in the process of moving to a new CMS which has left our sitemap nonfunctional. Since we are getting crawled so often, I've met resistance from an overwhelmed development team that does not see creating sitemaps as a priority. My question is, are they right? What are some reasons that I can give to support my claim that creating an xml sitemap will improve crawl efficiency and indexing if we are already having new stories appear in Google SERPs within 10-15 minutes of publication? Is there a way to quantify what the difference would be if we added a sitemap?
-
RE: Query strings in canonical URLs
Would it be a problem if I used a canonical with a query string but did not change the video URL? A new tracking system we'd like to use is based on the canonical URL, so for tracking reasons we want to append the video's unique ID to the canonical URL. Making this change is easy, but changing the actual video URLs will require a bit more dev work. Is this going to cause problems?
-
Query strings in canonical URLs
All video on my site resides at www.mydomain.com/video, in a player that does not assign a unique URL to each video. We may be able to rewrite the URLs to include a unique identifier found in the video's metadata (www.mydomain.com/video/?bctid=17769780). If I did this, how would it impact the canonical URL? Do the SEs accept canonicals with query strings? What if I only changed the canonical URL and did not change the video's URL? Would that be a problem?
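For what it's worth, here's a sketch of how I'd build the query-string canonical so it's always absolute and properly encoded. The bctid comes from the example above; the helper name is just for illustration:

```typescript
// Build the canonical URL for a video page with the bctid query string
// appended. Using the URL API keeps the value encoded and the URL absolute.
function videoCanonical(bctid: string): string {
  const url = new URL("http://www.mydomain.com/video/");
  url.searchParams.set("bctid", bctid); // handles encoding for us
  return url.toString();
}

// Rendered into the <head> of the video page:
const tag = `<link rel="canonical" href="${videoCanonical("17769780")}" />`;
console.log(tag); // <link rel="canonical" href="http://www.mydomain.com/video/?bctid=17769780" />
```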
-
Trying to determine if either of these are considered cloaking
Option 1) In the browser, we use javascript to determine if you meet the redirect conditions (referrer not mydomain.com and no bypassing query-string). If so, then we direct your browser to the subdomain.mydomain.com URL. Googlebot would presumably get the original page.
Option 2) In the browser, we use javascript to determine if you meet the redirect conditions. If so, we trigger different CSS that hides certain components of the page and use javascript to load in extra ads. Googlebot would get the unaltered page.
In both scenarios the page content does not change; however, the presentation is different. The idea is that under certain conditions users are redirected to a page with more ads. The ads are not too severe on the redirected page and will not cause an above-the-fold penalty. That said, would either option be considered cloaking by Google?
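To make option 1 concrete, here's a rough browser-side sketch of the condition check as described above. The bypass parameter name "noredirect" is made up, and the hostname match is deliberately simplistic:

```typescript
// Option 1 sketch: redirect to the subdomain version only when the visitor
// didn't come from our own site and no bypass parameter is present.
function shouldRedirect(): boolean {
  const referrer = document.referrer;
  // Simplistic host check for illustration; a real check should be stricter.
  const cameFromOurSite =
    referrer !== "" && new URL(referrer).hostname.endsWith("mydomain.com");
  const hasBypass = new URLSearchParams(window.location.search).has("noredirect");
  return !cameFromOurSite && !hasBypass;
}

if (shouldRedirect()) {
  // Same path and query string, different host.
  window.location.replace(
    "http://subdomain.mydomain.com" + window.location.pathname + window.location.search
  );
}
```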
-
Google Trends Hot Searches hourly historical data?
Is there a site that archives Google Trends' hourly Hot Searches data? I'd like to see if specific keywords were trending at a specific time yesterday. Is this data out there? Is there a different site I should be using for this info?
-
RE: What are the SEO best practices for infinite scrolling?
We actually didn't end up using infinite scroll. But if I were to use it, I'd probably just create a static HTML page that periodically updates throughout the day and use it as the canonical URL and for internal linking purposes. I know it's not ideal, but it's a solution I've used in the past with decent results for dynamically generated pages.
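For anyone curious, a rough sketch of what that static fallback could look like - a simple renderer fed by whatever data source drives the infinite scroll. The Story array here is made up:

```typescript
// Sketch of the static fallback page: a plain HTML list of the latest stories
// (the same ones the infinite scroll loads), regenerated on a schedule and
// used as the canonical / internally linked version.
interface Story {
  title: string; // should be HTML-escaped in a real implementation
  url: string;
}

function renderFallbackPage(stories: Story[]): string {
  const items = stories
    .map((s) => `    <li><a href="${s.url}">${s.title}</a></li>`)
    .join("\n");
  return `<!DOCTYPE html>
<html>
<head><title>Latest stories</title></head>
<body>
  <h1>Latest stories</h1>
  <ul>
${items}
  </ul>
</body>
</html>`;
}

const latest: Story[] = [
  { title: "Example story one", url: "http://www.example.com/story-one.html" },
  { title: "Example story two", url: "http://www.example.com/story-two.html" },
];

// Write this markup out to a crawlable URL every few minutes.
console.log(renderFallbackPage(latest));
```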
-
www.sitename.com or sitename.com?
A client of mine has a site that currently lives at sitename.com. www.sitename.com redirects to the non-www URL, and the canonical uses the non-www URL. This is a fairly new site and there aren't many existing inbound links. Is there any benefit to switching to the www version?
-
In mobile searches, does Google recognize HTML5 sites as mobile sites?
Does Google recognize HTML5 sites using responsive design as mobile sites?
I know that for mobile searches, Google promotes results from mobile sites. I'm trying to determine if my site, built in HTML5 with responsive design, falls into that category. Any insights on the topic would be very helpful.
-
RE: Best practices for migrating an HTML sitemap? Or just get rid of it altogether?
Any thoughts on the impact of removing the internal links in the sitemap? Will this hurt our domain authority? Or, given the low number of links compared to our whole link profile, is it not significant enough to cause concern?
-
Best practices for migrating an HTML sitemap? Or just get rid of it altogether?
We are migrating a very large site to a new CMS and I'm trying to determine the best way to handle all the links (~15k) in our HTML sitemap. The developers don't see the purpose of using an HTML sitemap anymore, and I have yet to come up with a good reason why we should migrate it rather than just get rid of it, since it is not very useful to users. The HTML sitemap was created about 6 years ago when PageRank sculpting was a high priority.
Currently, since we already have an XML sitemap, I'm not sure there's really a need for an HTML sitemap other than to maintain all the internal links. How valuable are the internal links found in an HTML sitemap? And will it be a problem if we remove them from our link profile? 15,000 links sounds significant, but they only account for less than 0.5% of our internal links.
What do you all think?
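One check I'm planning before we pull the trigger: find out whether any URLs are only reachable through the HTML sitemap, since those would become orphans. A rough sketch, assuming you can export two link lists from a crawl (the arrays below are placeholders):

```typescript
// Which URLs linked from the HTML sitemap have no other internal links?
// Those are the pages that would be orphaned if the sitemap were removed.
// Both lists are placeholders; in practice they'd come from a crawl export.
const linkedFromHtmlSitemap: string[] = [
  "http://www.example.com/archive/2009/article-a.html",
  "http://www.example.com/archive/2009/article-b.html",
];

const linkedFromRestOfSite: string[] = [
  "http://www.example.com/archive/2009/article-a.html",
];

const elsewhere = new Set(linkedFromRestOfSite);
const wouldBeOrphaned = linkedFromHtmlSitemap.filter((url) => !elsewhere.has(url));

console.log(`${wouldBeOrphaned.length} of ${linkedFromHtmlSitemap.length} sitemap URLs`);
console.log("have no other internal links:", wouldBeOrphaned);
```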
-
RE: SEO value to Reddit backlinks?
From what I can tell, Reddit nofollows links by default and sets them to follow if people vote them up. Any truth to this? I haven't been able to tell what the tipping point is though.
-
SEO value to Reddit backlinks?
Can someone point me to an authoritative article on the value of Reddit links? I'm trying to determine if it's something we want to put effort into on a regular basis. Are Reddit links followed? It seems like some are and some are not.
-
RE: Benefits to having an HTML sitemap?
Our HTML sitemap is broken down into different pages that never contain more than 250 links. All pages are linked via the top nav back to the homepage and to their section/subsection.
The issue I'm having is not that they don't know how to recreate our HTML sitemap in the new CMS. It's that they don't believe it serves a purpose, and given limited resources, they'd rather spend the time on other, more crucial work.
My biggest concern is the removal of thousands of internal links. Should I be worried about this?
-
Benefits to having an HTML sitemap?
We are currently migrating our site to a new CMS, and as part of this migration I'm getting push-back from my development team regarding the HTML sitemap. We have a very large news site with tens of thousands of pages. We currently have an HTML sitemap that greatly helps with distributing PR to article pages, but it is not geared towards the user. The dev team doesn't see the benefit of recreating the HTML sitemap despite my assurance that we don't want to lose all these internal links, since removing thousands of links could have a negative impact on our Domain Authority.
Should I give in and concede the HTML sitemap since we have an XML one? Or am I right that we don't want to get rid of it?
-
RE: Would I be safe canonicalizing comments pages on the first page?
I think I've decided to use the view-all page as the rel=canonical.
http://googlewebmastercentral.blogspot.com/2011/09/view-all-in-search-results.html
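Something like this is what I'm planning for the comment-page templates - every paginated comment page canonicals to the view-all page. The URL patterns are made up for illustration:

```typescript
// Sketch of the head snippet for each paginated comment page: every page of
// comments canonicals to the view-all comments page, per the Google post
// linked above. URL patterns below are hypothetical.
function commentPageHead(articleSlug: string, page: number): string {
  const viewAllUrl = `http://www.example.com/comments/${articleSlug}/all`;
  return [
    `<title>Comments on ${articleSlug} - page ${page}</title>`,
    `<link rel="canonical" href="${viewAllUrl}" />`,
  ].join("\n");
}

console.log(commentPageHead("example-article", 3));
```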
-
Would I be safe canonicalizing comments pages on the first page?
We are building comment pages for an article site that live on a separate URL from the article (I know this is not ideal, but it is necessary). Each comments page will have a summary of the article at the top. Would I be safe using the first page of comments as the canonical URL for all subsequent comment pages? Or could I get away with using the actual article page as the canonical URL for all comment pages?
-
Can anyone recommend Boston/New England SEO conferences/events?
One of my goals for 2012 is to network more, so I'm trying to find some SEO conferences to attend this year. Unfortunately I don't have much of a travel budget, so I'm looking for events/conferences in the Boston/New England area. I'm open to both large conferences and small local events. Are there any Boston specific groups I should join?
-
What's the best keyword tool for discovering regional/metropolitan area keywords?
Generally I use the Google Keyword Tool for my keyword research, but given that its data is either country-specific or global, I was wondering what others use for regional/DMA-specific keyword discovery. Regional traffic is very important to my site, so I'm hoping to find a tool I can use to find keywords germane to my audience.
-
Building backlinks for a newspaper site?
Anyone have tips on building links to a newspaper site? I am looking for a strategy to create links to a news site beyond building one link at a time. We have so much content that is constantly updated that it would be impossible to do this one page at a time.
-
Does anyone have tips for optimizing a daily deals site?
I'm looking into ways to optimize a daily deals website. Any best practices or tips out there other than promoting on social sites?
-
RE: How do you get photo galleries indexed on Google News?
Thanks Casey. I do not have a problem getting indexed on Google. I'm specifically referring to Google News indexing. I have yet to find any specific recommendations for getting photo galleries indexed on Google News.
We always have ALT tags filled out and we are getting better at naming our files with keywords, although we do have some CMS limitations in regard to file names. We do a pretty good job with writing captions and including keywords, we always specify the image size (height/width/file size), attribute images, and optimize internal anchor text with at least one major keyword phrase.
My real problem is that we do all of this for all of our photo galleries yet only some get indexed in Google News. The one thing we do not do is include photo galleries in our sitemap and I'm making that change this week. Hopefully that does the trick.
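Here's a rough sketch of the kind of News sitemap entry I'm planning to add for galleries - a normal <url> with the news: tags plus image: tags for each photo. Publication name, URLs, and dates are made up, and I'd double-check the current News/image sitemap docs before relying on this:

```typescript
// Sketch: generate a Google News sitemap entry for a photo gallery, combining
// the news: and image: sitemap namespaces. All data below is made up.
interface Gallery {
  url: string;
  title: string;
  published: Date;
  imageUrls: string[];
}

function escapeXml(value: string): string {
  return value
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;");
}

function newsSitemap(galleries: Gallery[]): string {
  const entries = galleries
    .map((g) => {
      const images = g.imageUrls
        .map((img) => `    <image:image><image:loc>${escapeXml(img)}</image:loc></image:image>`)
        .join("\n");
      return `  <url>
    <loc>${escapeXml(g.url)}</loc>
    <news:news>
      <news:publication>
        <news:name>Example News</news:name>
        <news:language>en</news:language>
      </news:publication>
      <news:publication_date>${g.published.toISOString()}</news:publication_date>
      <news:title>${escapeXml(g.title)}</news:title>
    </news:news>
${images}
  </url>`;
    })
    .join("\n");
  return `<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
${entries}
</urlset>`;
}

console.log(
  newsSitemap([
    {
      url: "http://www.example.com/photos/gallery-one",
      title: "Example photo gallery",
      published: new Date("2012-06-01T12:00:00Z"),
      imageUrls: ["http://www.example.com/photos/1.jpg", "http://www.example.com/photos/2.jpg"],
    },
  ])
);
```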
-
How do you get photo galleries indexed on Google News?
I work for a news site and some of our photo galleries get indexed by Google News while others never do. I'm trying to determine why some are more successful than others even though they all follow the same guidelines regarding keyword-rich headlines & copy, h1s, etc. When comparing what's been indexed in the past with current galleries, there doesn't appear to be an obvious pattern. Can anyone share some insight into this?
-
Search bots that use referrers?
Can someone point me to a list or just tell me specific search bots that use referrers?
-
RE: What are the SEO best practices for infinite scrolling?
This is for the homepage of a news site that publishes about 200-300 pieces of content per day. You'll be able to navigate to this content in other ways on the site, but we're toying with the idea of an infinite scroll for the homepage. I'm trying to figure out a way to make the homepage fully indexable.
-
What are the SEO best practices for infinite scrolling?
Is infinite scrolling bad for SEO? Is there a way to implement infinite scrolling without hurting a site's SEO?
-
Restaurant menu SEO: PDF or HTML?
Is it better to use a PDF or to hard-code restaurant menus (or any document, for that matter) in HTML? I want the content to be indexed and thought PDF was the way to go for several reasons, but I wanted to get confirmation before I move forward.
-
RE: Is it ok to use encoded special characters in meta titles?
Thanks that was very helpful!
-
RE: Is it ok to use encoded special characters in meta titles?
Sorry for not being more specific. I meant, is it OK to use the encoded entity &#123; in place of a literal { and &#91; in place of a literal [? I'm more concerned about these special characters having a negative impact on rankings than about title length. I've read that Bing tells webmasters not to use <>{}, but I've seen elsewhere that these are OK on Google. Would encoding them make their use acceptable on all SEs?
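To make it concrete, this is the sort of encoding I mean (the title text is made up):

```typescript
// Tiny helper showing what "encoded" means here: the literal characters get
// written as numeric character references in the title tag.
const ENCODE: Record<string, string> = {
  "<": "&#60;",
  ">": "&#62;",
  "{": "&#123;",
  "}": "&#125;",
  "[": "&#91;",
  "]": "&#93;",
};

function encodeTitle(title: string): string {
  return title.replace(/[<>{}\[\]]/g, (ch) => ENCODE[ch]);
}

console.log(`<title>${encodeTitle("Review: {Example} [2012]")}</title>`);
// <title>Review: &#123;Example&#125; &#91;2012&#93;</title>
```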
-
Is it ok to use encoded special characters in meta titles?
I've read blog posts saying that encoding special characters in title tags is OK, and others saying it's not. Is there a definitive answer out there?
Do the extra characters from adding encoding count towards the total number of characters that Google displays in SERPs? Or do they just count as one character?