Takeshi,
Do you think I could hurt my site if I embed the old code between the <iframe> tags? So:
<iframe codetothevideo>oldembedcode</iframe>
Thanks again,
Daniel
Hi Takeshi,
Thanks again today for your answer.
Does this rule, that Google will not show video rich snippets for YouTube/Vimeo, also apply to videos hosted in a Vimeo PRO account with the community pass deactivated?
Best Regards,
Daniel
Dear All,
I read somewhere that using the <noframes> tag when embedding videos with iframes could be seen as cloaking, because you are delivering one thing to people and something "different" to the search engines.
For compatibility with iOS devices, iframe is the way to go with YouTube videos, and also with Vimeo, but so far none of my videos have been indexed by Google.
So, in your experience, what advice would you give for embedding videos using those services (YouTube, Vimeo PRO)? Would Google index them if you place the old embed code between the <iframe></iframe> tags, without using <noframes>? Would you say that, without the use of <noframes>, one is doing cloaking?
I want to get my videos indexed, but I do not want to be penalized by Google. Technically the use of <noframes> could be taken as cloaking, but if the content of both videos is the same, then it is not. What do you think?
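To make sure we are talking about the same markup, this is roughly what I mean (a rough sketch with a placeholder VIDEO_ID, not the exact code from my site):

    <iframe width="560" height="315" src="http://www.youtube.com/embed/VIDEO_ID" frameborder="0" allowfullscreen>
        <!-- Old-style embed code as fallback content for clients
             that do not render iframes -->
        <object width="560" height="315">
            <param name="movie" value="http://www.youtube.com/v/VIDEO_ID"></param>
            <embed src="http://www.youtube.com/v/VIDEO_ID" type="application/x-shockwave-flash" width="560" height="315"></embed>
        </object>
    </iframe>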
Thanks for your help,
Daniel
Hi Takeshi, thanks for the answer again.
Would it prevent the deleted/expired pages from being shown as soft 404s in Webmaster Tools?
Hi Again,
@Takeshi Young: Thanks for your answer.
I will try to explain what is happening a little better.
We are using a CMS for classified ads. The script is able to give "SEO friendly" URLs, which are based on mod_rewrite. If a listing has an ID number, let's say "5", that listing's URL will look like this:
http://mydomain.com/5-listingname/details.html
After the listing expires, the URL is not valid anymore, and if a user tries to visit the listing, the script delivers a page with a message indicating that the listing is no longer active. The HTTP code is 200 "OK". If the listing is deleted, then a user trying to visit the URL will get a similar message, also with an HTTP code 200. It is a problem, because that page should return a 404 code, indicating to the search engine that the page is gone.
If a user tries to visit an invalid page, for example:
http://mydomain.com/invalidpage.html
then the system delivers the 404 page that is set in the .htaccess file. But since the script recognises the numeric parameter in the deleted/inactive listing URL, it does not deliver the 404 error but a page with a message, and this page is a soft 404 error, which is bad for SEO.
Repairing the script so that it delivers the proper 404 header is beyond my knowledge, but I can customize the error page as much as I want.
Then I have two questions:
If I set the soft 404 error page as noindex, will that be good enough to avoid being affected by the problem?
Is there any way of indicating to the search engine that a page is a 404, other than using the Apache .htaccess? Like a tag in the head section, or any trick that would help me with this problem?
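For reference, this is the kind of solution I am hoping for. Assuming the error page template is a PHP file I can edit (an assumption on my part; I have not dug into the script yet), something like this at the very top, before any output, would send a real 404 status:

    <?php
    // Hypothetical addition to the expired/deleted listing template:
    // send a real 404 status line before any HTML is printed.
    header($_SERVER['SERVER_PROTOCOL'] . ' 404 Not Found');
    ?>
    <html>
    <head><title>Listing not found</title></head>
    <body><p>Sorry, this listing is no longer active.</p></body>
    </html>

As far as I know there is no head tag that changes the HTTP status itself, which is why I am asking.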
Thanks in advance for your help,
Daniel
Hi Everyone,
I get a 404 error on my page if the URL is simply wrong, but for some parameters, like when a page has been deleted or has expired, I get an error page indicating that the ID is wrong, but no 404 error.
It is very difficult for me to program a function in PHP that solves the problem and to modify the .htaccess with mod_rewrite. I have asked the developer of the system to take a look, but I am not sure I will get an answer soon.
I can control the content of the deleted/expired page, but the URL will be very similar to the ones that are OK (actually, the URL could have been fine before and only expired now).
Thinking of solutions: I can set the expired/deleted pages as noindex. Would that help to avoid the duplicated title/description/content problem? If a user goes to, e.g., mywebsite.com/1-article/details.html, I can set the head section to noindex if the listing has expired. Would that be good enough?
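A rough sketch of what I mean, assuming the listing template is PHP and that the script exposes some flag like $isExpired (both assumptions on my side; the real script may differ):

    <head>
    <title>Listing title</title>
    <?php if ($isExpired): ?>
        <!-- Ask search engines not to index expired listings,
             while still letting them follow the links -->
        <meta name="robots" content="noindex, follow">
    <?php endif; ?>
    </head>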
Another question: is it possible to set the pages as 404 without having to do it directly in the .htaccess, thus avoiding the mod_rewrite problems that I am having? Some magical tag in the head section of the page?
Many thanks in advance for your help,
Best Regards,
Daniel
Dear all,
I have two small questions for you:
1) If I have a page with many internal links, and in the head section of this page there is a noindex, follow tag, will the search engine spider the internal links placed on that page? Those internal links lead to pages which I want the search engine to index.
2) Not related to the question above: does the nofollow meta tag affect in any way how the search engine crawls internal links? I would say it only has an effect on the external links, unless the nofollow is placed directly in the < a > tag.
Thanks for your help,
Daniel
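P.S. For clarity, these are the two variants I am asking about (plain HTML; example.com is just a placeholder):

    <!-- Page-level, in the head section; applies to every link on the page -->
    <meta name="robots" content="noindex, follow">
    <meta name="robots" content="nofollow">

    <!-- Link-level; the rel attribute applies only to this one link -->
    <a href="http://example.com/some-page.html" rel="nofollow">some page</a>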
Hi Sandip,
Thanks for your answer.
Could it be that, even if the crawler is not indexing the search result pages, it is still going to crawl the links that are on those pages? The attributes in this case: noindex, follow.
Then what would benefit me most is to index the first search result page and noindex from the second page on: indexing the first because it has the meta description and title, and I want people to find my categories (which are search results).
So, can the crawler follow the links on a noindex page if it has the follow attribute?
Thanks again,
Daniel
Dear All,
Is it a good idea to noindex the search result pages of a classifieds site?
Taking into account that category pages are also search result pages, I would say it is not a good idea; but since all the information is in the sitemap, Google can index the individual listings (which are index, follow) anyway. What would you do?
What effect does marking the search result pages as "search results" with schema.org microdata have on the indexing of the site?
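By "marking them" I mean something like this on the search result template (a minimal sketch; SearchResultsPage is a real schema.org type, the rest is placeholder markup):

    <body itemscope itemtype="http://schema.org/SearchResultsPage">
        <!-- search result listing goes here -->
    </body>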
Many thanks for your help,
Best Regards,
Daniel
Hi!
It is easy to know if somebody is spam-linking your website, e.g. by looking at Open Site Explorer to analyse the link profile.
But is it possible to know if a competitor of mine is redirecting a bad domain to mine with a 301 redirect, thus transferring any bad SEO reputation to me?
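If I already suspect a specific domain, I suppose I could check where it redirects with a few lines of PHP like these (a rough sketch; baddomain.example is a placeholder):

    <?php
    // Print the response headers of a suspect URL; a "301 Moved Permanently"
    // status followed by a Location header pointing at my domain would
    // confirm the redirect.
    $headers = get_headers('http://baddomain.example/');
    if ($headers !== false) {
        foreach ($headers as $line) {
            echo $line, "\n";
        }
    }
    ?>

Of course, that only confirms a suspicion; my real question is how to discover such domains in the first place.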
Best Regards,
Daniel
Hi!
My clients' listing keywords are safe, but the results for my domain name have left only the front page in a good position, and other bad-quality SEO pages, on which we have run online tests (they generate a page with the name of every site that does a test with them), have appeared in the first Google results.
If it is an update, why are so many spam pages coming up in good positions?
Update: Less than two hours later, the three sections of my site are ranking in the first three positions for my domain name. Things are changing a lot today.
Hi!
Some time ago, a friend of mine added our site to a directory. I did not notice it until today, when, in the search results for my domain name, the directory came up on the first page, in fourth position. My friend wrote a nice article describing our business, and the page has a dofollow link.
Looking at the metrics of that directory, I found the following:
The directory accepts a free article about a company and does not review it before it is published, but it does look for duplicated articles representing spam, so one company can only have one listing (in theory).
Is there any formula to know if a directory is safe for publishing a dofollow link? If they don't review the link, I would say that is not a good signal, but are there any other factors to take into account?
Many thanks for your answer!
We looked at the search phrases when selecting the category names, and doing this we found a lot of possible good categories for the site, without duplicate or confusing names. For example, instead of creating a category called "Consoles and Video Games", we use something more specific: "video consoles second hand", "video games used" (the actual keywords are in Spanish). It is only a small example, but it multiplies itself across different cases.
Looking at your site, I notice that you decided to use more general categories, and it works fine!
Nice system in Gumtree! Easy to navigate and good looking.
Thanks again for your answer,
Daniel
Hi Everybody,
We are adding a new feature to my site: a classified ads section. I spent the whole weekend researching keywords that could fit the category names. What I did is search for the exact phrases that people use related to the different categories that a classifieds site has. For example, if the category is for buying and selling cars, I searched for the most popular query in my language: "cars second hand". So far so good, but the problem is that if I do this, I will have far too many categories, because they would be too specific.
None of my future competitors use the most-searched keywords; they are more pragmatic and use general wording. They also do not have more than one level of subcategories, for the sake of usability and probably SEO.
Today I entered all of the almost 100 categories into the system, and it looks horrible. There are also too many links on the main page, and the categories are difficult to find in the drop-down list; users need to scroll down to find what they are looking for, even on the main page!
Another unwanted consequence is that too many categories will lead to too many empty categories in the first months, giving a bad impression of inactivity on the site and making it even more difficult to build momentum.
My question is: would there be any SEO benefit to using a long list of keyword categories that compensates for the problems created by being too specific?
Since none of our future competitors use many categories, I would answer myself with: keep it simple and short. But probably somebody with experience can give me some ideas.
Many thanks in advance for your help,
Daniel
A small update about the issue.
The only reliable way of measuring the effect of the international characters I have found so far is Google Webmaster Tools, in the Search Queries section. There one can find the ranking position for a specific keyword.
Unfortunately, Google treats the two versions differently; they are not the same.
In my case it is a small difference, 5 positions for a specific combination.
Thanks for your answers!
Daniel
Dear all,
Almost since we started designing our site, we have been using schema microdata. It is not only because of the rich snippets, but because I want the search engines to better understand what we have.
For example, the +1 button would not work properly without schema microdata, because it kind of ignores the Open Graph parameters that specify the image and description; and since we are a (very small) local business directory (among other things), all our clients have a hand-written, schema-compliant description on their listings, including address, opening hours, telephone number, description, etc. It is hand-written by us because the tools available are simply not good enough to cover all the different scenarios a listing can present.
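To give an idea, this is roughly the pattern we follow on a listing (a simplified sketch with made-up details, not a real client):

    <!-- Each listing wraps the client data in a LocalBusiness item -->
    <div itemscope itemtype="http://schema.org/LocalBusiness">
        <span itemprop="name">Example Bakery</span>
        <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
            <span itemprop="streetAddress">Calle Mayor 1</span>,
            <span itemprop="addressLocality">Madrid</span>
        </div>
        <span itemprop="telephone">+34 600 000 000</span>
        <time itemprop="openingHours" datetime="Mo-Fr 09:00-18:00">Mon-Fri, 9:00 to 18:00</time>
        <p itemprop="description">A hand-written description of the business.</p>
    </div>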
Until today I had not used proper schema markup for the homepage itself, and that is probably why our page lost the nice links below the site description in the Google snippet. I did not place it on the body tag, but near the description, closing it immediately after the description finishes. Now this is solved, and we will wait and see if the links come back in the next few weeks.
Now to the question. Our site has three sections, with three different systems installed: two running WordPress and a third running another script. The main site is the local business directory. The front page is marked as "schema.org/WebPage", and I do not know how to mark up the other pages of the main site. I was thinking of marking the listings as "schema.org/ItemPage", since they are related to specific clients. Would you consider that right? Then we have landing pages for the categories; should they be marked as WebPage, or as Article, or something else?
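In other words, something as simple as this is what I have in mind for the listing pages (just a sketch; the LocalBusiness block from the example above would sit inside):

    <body itemscope itemtype="http://schema.org/ItemPage">
        <!-- listing content, including the LocalBusiness markup -->
    </body>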
Many thanks in advance for your help,
Best Regards,
Daniel
Hi! Thanks for your answers.
So far our site has not dropped in the results for the keywords I tested. The only negative signals found until now are:
A dramatic drop in the number of pages indexed, according to Google Webmaster Tools, but not for the pages listed in the sitemaps; those were untouched. The dropped pages should be mainly duplicated content created by the CMS before we started using canonical tags (I am not 100% sure which pages were dropped).
A clean-up of keywords related to my site. When we installed a new CMS theme for a subsection of our site a few months ago, the theme had "example" posts, which I overlooked, and Google indexed those posts very quickly. For a long time I had "lorem ipsum", etc. as keywords in the Webmaster Tools list. Now they are gone, and the real keywords are still there.
I do not know what will happen in the next few days with the search results for my good keywords; I hope they maintain their position, or improve it. Let's see...
Thanks again,
Daniel
Hi Don,
Thanks for your answer. Sometimes the keyword tools show the same volume, while Google Trends shows a different number of searches for terms with and without the accent. I try to avoid those words when possible.
Should one use the incorrectly written word in difficult phrases for the sake of good search results? It looks unprofessional.
Dear All,
Using the keyword analysis tools, we found an interesting result: for one of our listings, which uses a word with a Spanish written accent ("fotografía", with í and not i), the report gives us an "F" if the keyword is written without the accent and an "A" if it is written with the accent (we use the properly written word in the content and title).
Is it only an SEOmoz tool-related issue, or does Google treat a word with an accent as a different word? Most people type into the search engine without using the "´" character, and doing some tests on google.es, I found slightly different results when writing it both ways, but not for my listing, which ranks in exactly the same position for both "words".
Does anybody have some deeper information related to the topic?
Daniel
Dear All,
Today I noticed that our page authority and domain authority are down, and the number of linking domains is also down. We do not have many domains linking to our site, but at least one blogger has linked to us (for some months now) with a very nice article about our site, and this follow link does not appear in Open Site Explorer. The blog is well indexed by Google but impossible to see in Open Site Explorer, for some reason unknown to me. There are also two pages we helped, and they link to us as well (follow links): their links appear in Webmaster Tools, but not in Open Site Explorer. Neither the blog nor those two pages were in Open Site Explorer before; the links that were removed are from other sources (SEO analysis pages, automatically generated when I ran a test with them, so I suppose they were not high-quality links at all and not content-related; we don't miss them).
Our site was partially down for a day and a half this week due to a change of server, from shared hosting to a VPS. Now all pages are up again, and we are improving everything, working really hard (almost no sleep at all in two days). Could that also have something to do with the lower domain authority? How badly does Google penalize a domain's downtime? How long will we feel the effects of that?
Listings published on our site are ranking exactly as before, but I am afraid this could change in the near future.
Since the end of December (not directly related to my question), we have not had the nice snippet showing the main pages of our site when we type the exact domain name into Google. We did not make any change at the time, and it did not affect the listings' rankings (they even improved in some cases); we rank better for the main keyword, so the snippet changed without other problems.
So, the questions:
Has Open Site Explorer made any change in the last few days to discard low-quality links?
Could the lower domain authority be related to the downtime of our site?
How long can the effects of the downtime be felt in the Google results? Is it something that can be felt forever? =(
Do you have any idea why Google decides in some cases to remove the sitelinks in the snippet? A general question, I know it is not possible to answer it for a specific domain without looking at it. We would like to have those links again.
Is there something we can do so that Open Site Explorer visits that blog I mentioned? I will try to crawl it using the dashboard to see what happens.
Many thanks for your help!
Daniel
Dear all,
I want to create an additional domain in order to:
Rank better for a very specific keyword with an exact match domain (I already asked about that, but I did not have my ideas clear at the time);
Offer the user useful information about the topic, without duplicating the content I have on my main domain, just additional and very specific information;
Use this domain as a landing page, offering a tutorial on how to use a specific section of my main domain, including a video tutorial;
Link to the related section of my main domain.
So, the main idea is: if a user types "this specific keyword of ours" into Google, they will see "thisspecifickeywordofus.es" in the results, click through to the site, and find unique and specific information complementing what I have on my main site and showing how to use it; so I am trying to use it for conversion.
I want to do only white hat SEO, so first of all I would like to ask you whether you think this is a good idea. The keyword is difficult to rank for, and if I can take advantage of this exact match domain (even if it is nowadays no longer such a big advantage), that would be great.
Second, do you see any problem with managing different domains from the same Google account? Newbie question, sorry.
Thanks in advance for your help,
Daniel
Dear all,
Our main site is a business directory, and following some SEO advice, we are creating landing pages for each category in order to optimize them for the keywords. Those landing pages have links to the listings related to them.
Using the same idea, we have created pages related to the regions, and those pages include links to the listings located in them.
The only problem that I see with this is the number of links that some categories or regions could have. Is there a recommended limit on the number of links per page, from an SEO perspective?
We also have a main category page that includes a list of all categories, and this page could also have a relatively high number of links.
The pages have around 300 to 500 words, some include also images, some include videos.
Many thanks for your help,
Daniel
Hi,
I am not an expert, so please do not take my answer too seriously. What you mention, making a canonical tag point to the same URL, looks fine. In my understanding, canonical tags were created to tell the search engines which page is the right one, even if the system you are using creates addresses that could look like duplicate content. For example, if you are using a Content Management System like WordPress or Joomla, you could have the following:
http://domain.com/date/month/page1 and so on.
Search engines (again, I am not sure, I am just a newbie) could think all these pages are duplicate content and could penalize you for it. But if you indicate with the canonical tag that the right URL is http://domain.com/page1, then you are safe.
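Concretely, the tag would go in the head section of each variant, like this (using the example URL from above):

    <head>
    <!-- Tells search engines which URL is the canonical version of this page -->
    <link rel="canonical" href="http://domain.com/page1" />
    </head>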
I hope somebody with more experience can help you better,
Best Regards,
Daniel