Nice find Ryan.
Latest posts made by AFW1179
-
RE: Increase in pages crawled per day
When you say URL variables, do you mean query string variables like ?key=value?
That is really good advice. You can check in your GWT (Google Webmaster Tools). If Google crawls your site and runs into a loop of parameter-generated URLs, it will not index that section of your site; crawling an effectively infinite URL space would be too costly for them.
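To see why parameter loops are costly for a crawler, here is a rough sketch (the parameter names and path are invented for illustration): a handful of sort/filter query string variables multiply into a large URL space even though every combination serves the same underlying page.

```python
from itertools import product

# Hypothetical filter parameters a listings page might expose.
params = {
    "sort": ["price", "date", "beds"],
    "order": ["asc", "desc"],
    "view": ["list", "grid", "map"],
    "page": [str(n) for n in range(1, 11)],
}

# Every combination is a distinct URL to a crawler, even though
# the underlying content is the same listings page.
urls = [
    "/homes?" + "&".join(f"{k}={v}" for k, v in zip(params, combo))
    for combo in product(*params.values())
]

print(len(urls))  # 3 * 2 * 3 * 10 = 180 URLs for one page of content
```

Four modest parameters already produce 180 crawlable URLs; add pagination on every filtered view and the space balloons further, which is exactly the kind of section Google gives up on.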
-
RE: Increase in pages crawled per day
There are two variables in play and you are picking up on one.
If there are 1,000 pages on your website, Google may index all 1,000, but only if it is aware of all of them. As you indicated, it is ultimately Google's decision how many of your pages to index.
The second factor, which is most likely the case in your situation, is that Google has only two ways to discover your pages. One is a sitemap submitted in GWT listing all of your known pages; Google could then choose to index all 1,000, since it would be aware of their existence. The other is crawling links, which it sounds like your website is relying on. If you have 1,000 pages but the home page contains a single link leading to an About Us page, then Google is only aware of two pages on your entire website. Your website has to have an internal link structure that Google can crawl.
Imagine your website like a tree root structure. For Google to reach and index every page, it needs clear, defined, easy access. A site whose home page links to page A, which links to page B, which links to page C, which links to page D, which then links to 500 pages, can easily lose those 500 pages if there is an obstruction anywhere along the chain leading to page D, because Google can't crawl through to page D to discover the pages behind it.
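The sitemap route mentioned above is just an XML file listing every URL you want Google to know about, submitted through GWT. A minimal sketch (the domain and paths are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
  </url>
  <url>
    <loc>https://www.example.com/about-us</loc>
  </url>
  <!-- ...one <url> entry per page, up to 50,000 per sitemap file -->
</urlset>
```

With a sitemap in place, Google's awareness of your pages no longer depends on how well your internal links chain together, though a crawlable link structure is still important for passing authority.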
-
RE: An SEO Strategy (need review)
Ryan, I would like your participation in the discussion. However, I fear you are hearing what you want to hear and not what I am saying. I didn't say I wanted to charge based on the number of links I build; I want to charge based on results. I don't believe I will achieve consistent, sustainable results by marketing content and praying someone links to it.
For some industries that strategy works great: SEO, web design, etc., where the average user has a website and understands how to use it. In real estate, there are not a lot of people who will link to a real estate website. Certainly no real estate agent will link to another one; they could lose their clients. Furthermore, the industry is dominated by brands, and none of those brands will link to agents. Even the agent profile pages don't have links; the ones that do, you must pay for, and they are no-follow.
I'm asking you to participate in the discussion, but understand there is not one SEO strategy for all niches. People will not link to real estate websites. That is why, if you pull up your local agent websites, you will see they have a DA around 10. Just google [city] real estate and [city] homes for sale; you will see the brand sites dominate the SERPs. On the second page you will start seeing local realtor websites, all of which have nearly no authority.
Help me out! As a community we can think of something.
-
RE: An SEO Strategy (need review)
I appreciate your help. I am looking to discuss link building as a topic. It's not that what you suggest wouldn't build links, but in my opinion the methodology you suggest is more along the lines of content marketing with link acquisition as a secondary benefit. I am looking for cold, solid links on site. In my (significant) experience, clients won't pay for content marketing if it's not getting results, and it can't get results consistently because there are so many variables. A link builder needs to be building links.
-
RE: Duplicate title tags due to lightbox use
Keszi, is that really necessary? As a programmer I find it difficult to believe that Google would require such a ridiculous thing. Any URL can have a query string appended, and then a link built to that URL + query string. Query strings simply add a ? followed by key=value pairs to the URL. If this were necessary, I could go to a competitor's website, add a query string, and build a bunch of backlinks to it; Google would then consider their whole website duplicate content, because the root URL and the query string version of it serve the same page.
What I am getting at is that query strings don't require such action. Something on the client side should never require server-side action by any third party (Google or otherwise); it would be too easily manipulated.
However, there are times when a website uses valid query strings to sort/filter or even deliver content (as modern CMSs like Joomla, Drupal, and WordPress do). When a site queries a database via a query string, there is normally significant code that handles the URL and turns it into a URL in its own right: you don't see ?p=23, you see a URL-friendly version.
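As a quick sketch of how a query string relates to its base URL (standard-library Python; the URL itself is a made-up example): the ?key=value pairs are just an appendage that the server is free to use or ignore, which is exactly why anyone can tack one onto someone else's URL.

```python
from urllib.parse import urlsplit, parse_qs

url = "https://www.example.com/listings?p=23&sort=price"
parts = urlsplit(url)

# The path identifies the resource; the query string is a
# separate bag of key=value pairs appended after the "?".
print(parts.path)             # /listings
print(parse_qs(parts.query))  # {'p': ['23'], 'sort': ['price']}
```

Whether /listings and /listings?p=23&sort=price are the "same page" is entirely up to the server-side code handling that query dict, which is why Google can't blindly treat every parameterized URL as a distinct document.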
If your website has query strings that sort/filter, and the site is not sophisticated enough to rewrite those query strings with .htaccess into SEO-friendly URLs, I would suggest reading and utilizing this Google page/tool:
https://support.google.com/webmasters/answer/6080548?hl=en
It covers this EXACT issue: websites exposing query strings that let the user sort/filter.
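For the .htaccess route mentioned above, here is a minimal Apache mod_rewrite sketch, assuming an Apache host and a hypothetical ?p= parameter (both assumptions, not anything from the original thread): the friendly URL is what visitors and crawlers see, while the query string version is only used internally.

```apache
# Hypothetical rule: /listing/23 is rewritten internally to index.php?p=23,
# so only the friendly URL is ever exposed. QSA preserves any extra
# query parameters (e.g. sort/filter) by appending them.
RewriteEngine On
RewriteRule ^listing/([0-9]+)/?$ index.php?p=$1 [L,QSA]
```

If rewriting isn't an option, the URL Parameters tool linked above is the fallback: it tells Google how each parameter changes (or doesn't change) page content.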
-
RE: An SEO Strategy (need review)
I don't think we are going in the same direction.
How would you build backlinks for a real estate client? Don't try to sell me on "create content," because no one will link to it. Real estate is dominated by brands like Zillow, Redfin, Trulia, Realtor.com, etc. Agents' websites have, on average, about a 10 DA. There isn't a lot to work with. I am looking for ideas.
-
RE: An SEO Strategy (need review)
The videos and images are of homes for sale. It is unlikely that individuals will use them on any other website. The goal is to build links from the profile pages/channels/video excerpts.
-
An SEO Strategy (need review)
I work in the real estate vertical. My clients possess significant content, though it's not written: they have tons of images and plenty of videos, plus content in the form of descriptions of homes, etc. They don't have written content that would be valuable in an attempt to rank. Most traffic in the real estate vertical is for [city] real estate and [city] homes for sale, and agents rarely use those phrases, certainly not when doing what they do: promoting their listings.
I am thinking I need to build a link building strategy around their videos and photos. There are tons of high-DA sources to get links from. With video I could use YouTube, Vimeo, Veoh, Dailymotion, Hulu, etc.; all of these sites are DA 90+. None of the links would be followed, though; they would all be no-follow. I could have a profile backlink and a backlink on each video, so one video distributed to 10 sites would be worth 10 backlinks, and a client would build hundreds of backlinks a year, all of value. I could deep link all of them to the appropriate subdivision landing pages (long tail).
The same strategy is applicable with photos. There are dozens of high-DA sites that syndicate images, and all would yield a lot of high-DA, no-follow links.
Please discuss this strategy. Also, if you can think of another strategy to build backlinks for real estate, please share it. I want to discuss real, ground-level backlink building, not "just build content and they will come." I need these sites to rank, and I don't know if no-follow links will even help them rank for long-tail keywords.
-
RE: Wordpress 404 Errors
I just looked at the site and it doesn't appear to be on WordPress. Is it on WordPress right now?