I guess not. What do you mean by "indexed differently"?
Posts made by FedeEinhorn
-
RE: How to use canonical with mobile site to main site
-
RE: What constitutes a duplicate page?
When you visit the pages that SEOmoz tags as duplicate, is the content actually duplicated? If it isn't, there's nothing to worry about.
We get duplicate content notices too; those are usually tag pages that, at a given moment, list the same posts because all of those posts share the same tag.
It would be great if you could post a couple of the pages that are reported as duplicate content, and where the duplication can be found, so we can take a look.
-
RE: How to use canonical with mobile site to main site
If the content is the same in the desktop and mobile versions, yes. The rel=canonical only tells the search engine which page should be indexed. Since the content is the same, indexing the main (desktop) page should do it; you would then redirect mobile visitors to the mobile version once they click on the result.
Hope that helps!
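For reference, the annotation Google recommends for a separate mobile URL can be sketched like this (placeholder URLs; the mobile page points at the desktop page with rel=canonical, and the desktop page points at the mobile page with rel=alternate):

```html
<!-- On the desktop page (http://www.example.com/page) -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://m.example.com/page" />

<!-- On the mobile page (http://m.example.com/page) -->
<link rel="canonical" href="http://www.example.com/page" />
```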
Here's a video from Matt Cutts about mobile content:
-
RE: Where To Add New Content In Site URL?
Agreed, subdomains are treated by search engines as separate domains, so no benefit flows from subdomain.domain.com to www.domain.com.
If the new content you are creating is related to www.domain.com's content, it should go in a subdirectory.
-
RE: What constitutes a duplicate page?
My first guess is: if the information on the page is updated because the previous details are no longer valid, why not remove the old page entirely?
Either way, removing the info or leaving it there shouldn't cause any problem, since the content isn't the same. But I'd guess the data for some days matches previous dates, hence my suggestion to remove the old (now useless) timetables.
-
RE: Why is google webmaster tools ignoring my url parameter settings
Allow a couple of months to see the changes. If they were made recently, Google will take a while to remove the duplicate content errors.
-
RE: Updating existing content - good or bad?
If you are just updating the title, or rewriting the content, then I would go with the same page instead of creating a new one.
IF-MODIFIED-SINCE is the way to tell spiders whether the content has changed. You can read more here: http://www.feedthebot.com/ifmodified.html
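As a sketch of what happens server-side with that header (generic logic, not tied to any framework; the function name is made up for illustration):

```javascript
// Decide whether to answer 304 Not Modified instead of resending the page.
// ifModifiedSince: value of the If-Modified-Since request header (or null).
// lastModified: the date the page content last changed (HTTP date string).
function shouldSend304(ifModifiedSince, lastModified) {
  if (!ifModifiedSince) return false; // client sent no cached date: full 200
  const since = Date.parse(ifModifiedSince);
  const modified = Date.parse(lastModified);
  if (Number.isNaN(since) || Number.isNaN(modified)) return false;
  // Content unchanged since the client's cached copy: reply 304 Not Modified.
  return modified <= since;
}
```

If the content truly changed, update the Last-Modified date so spiders re-fetch; if it didn't, answering 304 saves them the download.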
-
RE: How come a site with low PR and DA is first on everything?
I find Moz numbers much more accurate than Google rankings.
There could be several reasons why another site ranks higher even if you have done all the right things. The most common one is link spam (which will backfire on them later).
I've seen several of the cases you mention; usually they are pages that come up, stay in the top spots for 1-6 months, and then disappear.
The real problem is that the person behind that page usually does the exact same thing with another website while you keep losing money. And behind all of it sits Google, which apparently can't sort it out.
You are only left with reporting those sites and, if you are really lucky, someone from Google will look into it and take the appropriate action.
-
RE: Updating existing content - good or bad?
Matt Cutts from Google pointed out in a Webmaster Help video that you should update the existing page instead of creating new pages containing only the updates.
You can signal on the old page that the content was updated using "IF-MODIFIED-SINCE".
I can't find the video right now, but I am sure he said that.
-
RE: Google not pulling my favicon
The favicon.ico is named exactly that, sits in the root, and is accessible both with and without SSL.
Google used to show it, but a few weeks ago I noticed the issue while I was posting a link to my site in G+.
-
Google not pulling my favicon
Several sites load favicons through Google's favicon service instead of loading them from the website itself.
Our favicon is not being pulled from our site correctly; instead the default "world" image shows.
https://plus.google.com/_/favicon?domain=www.example.com
is the address used to pull a favicon. When I post on G+ or visit other sites that use that service, ours isn't displayed, even though it shows up in Chrome, Firefox, IE, etc., and we have the correct meta tag on all pages of our site.
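For reference, the markup on our pages is along these lines (placeholder domain):

```html
<!-- Favicon link present on every page; the domain is a placeholder -->
<link rel="shortcut icon" href="http://www.example.com/favicon.ico" type="image/x-icon" />
```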
Any idea why this is happening? Or how to "ping" Google to update it?
-
RE: In OpenSiteExplorer, why is HTTPS not an option?
Quoting Megan from SEOMoz:
"Hi Peekabo!
This is Megan from the SEOmoz Help Team. Unfortunately, OpenSiteExplorer does not support https at this time. Sorry about that! This also means that any metrics that are gathered by our Mozscape Index (the index for OSE) are going to be for the http version of the site, which includes Page Authority. We hope to support https sites in OSE at some point, but it's probably still a ways out. We'll update this feature request thread as soon as this feature is available, so you can keep an eye on it: https://seomoz.zendesk.com/entries/20770156-open-site-explorer-to-crawl-https
If you have any other questions about our tools, feel free to send us a message at help@seomoz.org.
Cheers!"
-
RE: How Do I Generate a Sitemap for a Large Wordpress Site?
In my case, xml-sitemaps works extremely well. I fully understand that a DB solution would avoid the need to crawl, but the features I get from xml-sitemaps are worth it.
I am running my website on a powerful dedicated server with SSDs, so perhaps that's why I'm not seeing any problems. I also set limits on the generator's memory consumption and enabled the feature that saves temp files in case the generation fails.
-
RE: How Do I Generate a Sitemap for a Large Wordpress Site?
I would go with the paid solution of xml-sitemaps.
You can set all the resources you want it to have available, and it will store temp files to avoid excessive consumption.
It also offers settings to create large sitemaps using a sitemap_index, and there are add-ons that build the news sitemap automatically by looking for changes since the last sitemap generation.
I have it running on my site with 5K pages (excluding tag pages) and it takes 10 minutes to crawl.
Then you also have plugins that create the sitemaps dynamically, like SEO by Yoast, Google XML Sitemaps, etc.
-
RE: Multilingual Website - Sub-domain VS Sub-directory
According to Matt Cutts, if you don't want to get all those localized domains, such as domain.it, domain.es, etc., then you are better off using subdomains.
Here's the video: http://www.youtube.com/watch?v=GyWx31GeQWY
But it is really up to you. Whatever route you take, make sure you add each subdomain/domain/folder to Webmaster Tools; that way you can target a specific market for each language.
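On top of the Webmaster Tools targeting, hreflang annotations can tell engines which language version is which. A hedged sketch (hreflang is my addition here, not something from the video; URLs are placeholders, and every version lists all the others plus itself):

```html
<!-- In the <head> of each language version -->
<link rel="alternate" hreflang="en" href="http://www.example.com/" />
<link rel="alternate" hreflang="es" href="http://es.example.com/" />
<link rel="alternate" hreflang="it" href="http://it.example.com/" />
```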
Hope that helps!
-
RE: Affilate Programs on Subdomains
As far as I know, a 301 redirect passes PageRank to the final destination, so if you accumulate too many bad links, they will affect your main domain's rankings.
I would suggest creating JavaScript tags that your affiliates can use to display banners and links to your site, adding the nofollow attribute to the links the JS creates.
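A minimal sketch of the tag such a JS snippet could emit (the function name, `aff` parameter, and URLs are all made up for illustration):

```javascript
// Build the HTML for an affiliate banner link. rel="nofollow" keeps the
// affiliate link from passing PageRank back to your domain.
function buildAffiliateBanner(affiliateId, bannerUrl, targetUrl) {
  const href = targetUrl + "?aff=" + encodeURIComponent(affiliateId);
  return '<a href="' + href + '" rel="nofollow">' +
         '<img src="' + bannerUrl + '" alt="banner"></a>';
}
```

The affiliate would include your script, which writes this markup into the page, so every placement carries the nofollow automatically.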
There's a Matt Cutts video where he explains how 301 redirects work:
-
RE: How do I find and can I trust link building companies
Here's the link Francisco refers to (I guess): http://www.seomoz.org/article/recommended
-
RE: Need help creating sitemap
I've been using xml-sitemaps software for years now; their paid software is great. If you are running a PHP-based server, you just add the files to a directory and follow the installation steps. It will not only index the entire site, but you can also set limits on how many links go in each sitemap, and they offer add-ons that automatically create image/video sitemaps, news sitemaps, etc. The software also creates an HTML version of the sitemap, and the XML version uses CSS so you can read it with ease.
I would go with them; the software even shows you how to submit the sitemaps to Google, build a sitemap index, etc. You can also add a line to robots.txt so spiders know where to find the sitemap:
Something like:
User-agent: *
Sitemap: http://www.example.com/sitemap.xml
-
RE: Are flip books - pdf readers on websites SEO friendly?
Oh, just spotted this video that refers to your question:
-
RE: Sitemap indexing problem
Sitemaps from xml-sitemaps are completely reliable. Google does not index every page of your site, and it also doesn't show the exact number of indexed pages; that way they prevent users from using shady tactics to game the system.
In my case, with about 4K pages submitted to Google via sitemap, about 3.5K are indexed (that's what it says in WT), but a site:example.com search shows about 5K results. So as you can see, the numbers Google shows are not exact, just estimates. If you submit 200 pages via the sitemap and Google indexes only 10, then there's a problem.
As for your traffic change, you should look at other possible reasons, and it could also take some time to reindex all your pages.
-
RE: If parent domain is www, does it matter if subdomain on a different server is non-www?
Basically, www domains are a subdomain of example.com. You don't need to have your subdomains as www.service.example.com, if that was what you were asking. That's just a poorly structured URL, IMHO.
There shouldn't be any difference for edu or gov domains.
-
RE: Do sitewide links from other sites hurt SEO?
I would definitely remove those links. They were unnaturally created. Although you are friends and you are not giving him anything in return (that you mentioned), do those links have any value for his visitors? I mean, when users are on his site, are they likely to click on those links? If not, then those backlinks are useless not just for users but for SEO too.
-
RE: Implementing Pinterest On Site and Social Snippets
As far as I know, the Pin It button uses the image you specify in its attribute, and if there's no image there, it lets the user choose from all the images it finds on the page.
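The button markup is roughly like this (a sketch from memory; the media parameter is the image the pin uses, and the exact attributes may differ from what Pinterest's widget builder currently generates):

```html
<!-- Pin It button: "media" is the image that gets pinned; URLs are placeholders -->
<a href="//www.pinterest.com/pin/create/button/?url=http%3A%2F%2Fwww.example.com%2Fpage&media=http%3A%2F%2Fwww.example.com%2Fimage.jpg&description=My%20page"
   data-pin-do="buttonPin">Pin It</a>
<script async defer src="//assets.pinterest.com/js/pinit.js"></script>
```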
-
RE: Should I consolidate pages to prevent "thin content"
There are a couple other scripts/enhancements you can do to speed up the site:
- CDN - Loading images using a CDN (Cloudflare offers that for free).
- Image optimization
- Lazy loading the images (Also available for free using Cloudflare)
- etc.
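As an example of the lazy-loading point, the simplest variant is the browser-native attribute (a newer HTML feature; Cloudflare's Mirage and various JS libraries achieve the same effect with scripts):

```html
<!-- The browser defers fetching the image until it nears the viewport -->
<img src="/images/photo.jpg" loading="lazy" alt="Product photo" width="640" height="480">
```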
-
RE: Logged In Only Content Made Available to Googlebot
If the content gets indexed, then it's no longer protected from others seeing it; they could just find the rest in Google.
There's no point in blocking users while allowing Google to index it. Instead, you could build pages with excerpts of the content, so both Google AND users can see those, and users can then decide whether to proceed as a logged-in user.
Google offers site authentication, but only for AdSense purposes. You can give them the login details they need to access that content, but that won't change the indexing for search.
Hope that helps!
-
RE: Are flip books - pdf readers on websites SEO friendly?
Google indexes PDF content and files almost like regular HTML: links are followed, you can block indexing, etc. The only thing Google can't index from PDFs is images, unless you have them in HTML format elsewhere.
I would definitely recommend converting those PDF menus to regular HTML.
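On the "you can block indexing" point: since you can't put a robots meta tag inside a PDF, it's done with the X-Robots-Tag response header. A sketch for Apache (assuming mod_headers is enabled):

```apache
# Send "X-Robots-Tag: noindex" with every PDF so engines keep them out of the index
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>
```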
You can find more info here:
What file types can Google index?
Can google fully index pdf files?
Hope that helps!
-
RE: Does Bing Business Portal still not accept Canadian listings?
Apparently there's a workaround for Canadian businesses:
http://www.stepforth.com/blog/2013/get-listed-on-bing-local-canada/
Hope that helps!
-
RE: Is there anyway to recover my site's rankings?
No problem! If you notice several bad backlinks, you could use a tool like removeem.com. It's a paid tool, but it will help you identify those backlinks and contact the site owners. I hope you get it resolved.
-
RE: Is there anyway to recover my site's rankings?
There are several tools online to review your backlink profile. I would start with OpenSiteExplorer. Check all links and pick out those with very low authority, then check those pages and see if they are related to your content. If there's not much value in them (the content is unrelated and the overall 'worthiness' is low), you could contact the webmaster and ask to have the backlink removed.
-
RE: Personalization for non logged in users
I guess not; at least they shouldn't. But they have been caught doing pretty ugly stuff.
They could still use your IP to personalize search; there's nothing wrong there.
-
RE: Unusual activity
Try using Cloudflare. They will give you lots of information about your traffic and block bad traffic. You could test with their free plan for a while; there's no need to pay anything. Once you have it figured out, you can go back to normal or stay with Cloudflare, which won't do you any harm.
-
RE: Is there anyway to recover my site's rankings?
Well, if you have cleaned your backlink profile and there are still links that were not removed, you could try disavowing them using Google's tool and see what happens. You should only use that tool if you are sure those links are hurting you and you have already done everything you can to contact the webmasters to have them removed.
If that doesn't work, then it is your content. Are you creating worthy content? Are people accessing your content pages? Sharing them? Is the content really good for users, or was it built with search engines in mind? Good, worthy content will earn you backlinks, which ultimately will improve your rankings.
-
RE: Two domains, one unique design?
If they are totally different services (such that you need separate domains), I would go with different themes. Not because of an SEO penalty (there's none that I am aware of) but because of user experience. Each website offering a specific service works better with a design that matches the service, IMHO.
-
RE: Personalization for non logged in users
The only personalization Google can do if you are not logged in and have no cookies stored is to use your IP to trace your location. They don't need to keep track of your IP: every request you make carries your IP address, and a simple IP trace can reveal lots of information about you: http://tracethisip.com/myip
-
RE: Is there anyway to recover my site's rankings?
First step would be to check on your competitors. What are they doing? How have they reached those rankings? Do some research on them and then compare to yours. What are the differences? Are they doing something that you aren't?