We have multiple family lines of products. This one family line has been pretty stagnant, and an SEO firm pointed that out while trying to win our business. I wanted to see if there is any merit to their suggestion.
Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
anthonytjm
@anthonytjm
Job Title: Senior Web Developer
Company: TJM Promotions
Website Description
We design and manufacture promotional products for awareness campaigns, promotions, and custom applications.
Favorite Thing about SEO
Getting Ranked!
Latest posts made by anthonytjm
-
RE: Are templates considered duplicate content?
-
Are templates considered duplicate content?
We have a line of products that all use the same template, or shell, for the website structure. Each site has different content relating to a specific product or service, but since it's a line of different products under one family, we use the same colors and template structure for consistency and branding purposes.
It was just brought to my attention that using a template like this across multiple sites could raise duplicate-content flags, as Google is reading the same template code and may not recognize that it's a family product line of sites.
Does anyone have any feedback on whether this could be true or not?
-
RE: What are your favorite tactics for getting links to money-pages?
We worked on a website that sells silicone wristbands. Not an exciting product anymore by any means. What we did to help create a buzz was partner with various awareness causes during a given month and post polls and fundraising results on landing pages. Then we had each awareness cause we worked with link to those landing pages, where visitors could see the amount of money raised, read more about the cause, and even place an order to support it.
If the client has a mailing list of any significant size, you can send out email blasts offering discounts, free upgrades, or other incentives that are further explained on landing pages, building traffic and possibly links to those pages as well.
I'm also looking forward to hearing some strategies others may have to offer.
-
What do you use for a site audit?
What tools do you use for conducting a site audit? I need to do an audit on a site, and the SEOmoz web crawler and on-page optimization tools will take days, if not a full week, to return any results.
In the past I've used other tools that I could run on the fly, and they would return broken links, missing heading tags, keyword density, server information, and more.
Curious as to what you all use and what you may recommend in conjunction with the Moz tools.
-
RE: Who's going to Mozcon next month
OK, we are getting closer! I'm signed up for the Mozplex tour Tuesday afternoon at 4 pm. Anyone else going to be on this tour? Maybe we can meet up for dinner, drinks, and networking afterwards? Feel free to message me to swap numbers or make plans.
-
RE: Two websites with similar products
Bilal, since you already have one site at the top of Google, why not come up with a different strategy to try to get the second site moving up in the rankings? Sounds like it could be a good springboard for experimenting with optimizing for better ranking.
Unless you're simply wanting to merge the sites and concentrate solely on the one site.
-
Who's going to Mozcon next month
Off-topic question.
Who's going to Mozcon next month, and what's on the agenda for the Tuesday evening before the event starts?
Anyone interested in having a meet and greet and talking shop over drinks or dinner the Tuesday before (July 24th)?
-
RE: Could you use a robots.txt file to disallow a duplicate content page from being crawled?
Per Google Webmaster Tools:
"If Google knows that these pages have the same content, we may index only one version for our search results. Our algorithms select the page we think best answers the user's query. Now, however, users can specify a canonical page to search engines by adding a link element with the attribute rel="canonical" to the head section of the non-canonical version of the page. Adding this link and attribute lets site owners identify sets of identical content and suggest to Google: 'Of all these pages with identical content, this page is the most useful. Please prioritize it in search results.'"
-
RE: Could you use a robots.txt file to disallow a duplicate content page from being crawled?
Well, the answer would be yes and no. A robots.txt file would stop compliant bots from crawling the page, but links from other pages on the web to that blocked page could still get its URL indexed. As posted in Google Webmaster Tools:
"You need a robots.txt file only if your site includes content that you don't want search engines to index. If you want search engines to index everything in your site, you don't need a robots.txt file (not even an empty one).
While Google won't crawl or index the content of pages blocked by robots.txt, we may still index the URLs if we find them on other pages on the web. As a result, the URL of the page and, potentially, other publicly available information such as anchor text in links to the site, or the title from the Open Directory Project (www.dmoz.org), can appear in Google search results."
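That crawl-versus-index distinction can be sketched with Python's standard-library robots.txt parser (the rules and URLs below are made-up placeholders, not from the thread): a Disallow rule only tells compliant bots not to fetch the page; it says nothing about whether the URL itself can still end up indexed via links elsewhere.

```python
from urllib import robotparser

# Parse a hypothetical robots.txt directly, no network fetch needed
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /duplicate-page.html",
])

# The blocked page cannot be fetched by compliant crawlers...
print(rp.can_fetch("*", "https://example.com/duplicate-page.html"))  # False
# ...but other pages stay crawlable, and any links they contain can
# still surface the blocked URL in search results.
print(rp.can_fetch("*", "https://example.com/original-page.html"))   # True
```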
I think the best way to avoid any conflict is to apply the rel="canonical" tag to each duplicate page that you don't want indexed.
You can find more info on rel canonical here
Hope this helps out some.
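Concretely, a canonical tag is just one line in the head of each duplicate page; a minimal sketch (the example.com URL is a placeholder for your preferred version of the page) looks like:

```html
<!-- Placed in the <head> of each duplicate/non-canonical page -->
<link rel="canonical" href="https://www.example.com/preferred-page/" />
```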
-
RE: Changing Link Title Tags & Backlinks
1. Yes, the link juice will eventually be applied to the new page. This is part of the benefit of a 301 redirect. It sometimes just takes a little time for it to process and show up in reports.
2. The link juice (link equity) will eventually pass through to the new URL once Google has established and indexed the new page. This could take weeks to months.
3. I would continue to optimize on a regular basis. Make sure that after each 301 redirect you also adjust all internal linking to point to the new page, especially in your sitemaps! Otherwise it could get confusing for both visitors and bots.
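For reference, a single-page 301 on Apache can be one line of config; this is a hedged sketch with hypothetical paths, assuming an .htaccess-enabled Apache setup rather than any particular site's configuration:

```
# .htaccess — permanent (301) redirect; both paths are placeholders
Redirect 301 /old-page.html https://www.example.com/new-page.html
```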
Hope this helps.
I first got started in web design back in 1999 working as a freelancer. Seven years ago I moved to Ocala, Florida and started working for TJM Promotions. We manufacture and design just about every promotional product under the sun.