Blog Post Relevance vs Traffic
-
Hello Moz community,
I have a website (let's say I am selling t-shirts) and want to create a blog (www.website.com/blog). On that blog, I would like to post about random topics including travel, leisure, events, new products, etc. I want to do this to: 1) earn affiliate commissions, 2) increase web traffic, and 3) potentially increase my domain authority.
I would like to know if this blog with random posts would negatively affect my website ranking and keyword ranking (for t-shirts), as it is not relevant to the products that I am selling (and the keywords that I am ranking for).
Google currently understands that my website is a webstore selling t-shirts (and I am ranking for t-shirt keywords), but would the algorithm get confused if I start posting about travel and other random topics? Your help in the matter is greatly appreciated.
-
How can we handle 404 errors on classified-ads sites? We get a 404 error every time someone deletes a listing. Example: maroc annonce immobilier or maroc annonce.
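One common way to handle this (a sketch, not the only answer): when a listing is deliberately removed, return 410 Gone so search engines drop the URL quickly, or 301-redirect to the listing's category page so any link equity is kept. The helper below is a minimal, hypothetical illustration of that decision logic; the `live_ads` set and `deleted_ads` mapping stand in for whatever your site's database records when a listing is removed.

```python
# Sketch: decide how to answer a request for a classifieds listing URL.
# live_ads: ids of listings still online; deleted_ads: id -> category URL
# recorded when the listing was removed (both hypothetical stand-ins).
def response_for_listing(ad_id, live_ads, deleted_ads):
    """Return an (HTTP status, redirect target) pair for a listing request."""
    if ad_id in live_ads:
        return (200, None)                # listing still online, serve it
    if ad_id in deleted_ads:
        return (301, deleted_ads[ad_id])  # redirect to the category page
    return (410, None)                    # gone for good: search engines drop it

live = {"ad-123"}
deleted = {"ad-99": "/immobilier/"}
print(response_for_listing("ad-99", live, deleted))   # (301, '/immobilier/')
print(response_for_listing("ad-000", live, deleted))  # (410, None)
```

Whether you prefer 410 or 301 per listing is a judgment call: 410 is cleaner for one-off ads, 301 to the category keeps visitors on the site.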
-
@pturbo No. If you post regularly on different topics, Google will understand your website's content and rank it for those other topics as well, provided the content is unique, user-friendly, and informative. However, you do have to post regularly.
-
Hello,
I added a blog plugin to my classified-ads site (Annonces Gratuites), and it worked. It is important that the topics be related to your keywords.
-
I would suggest creating a blog to publish articles that explain anything related to your business. If you can tell stories about the process of creating your products, or what inspired a particular product, that would be very good content, and it adds up.
-
Hi!
I'm in the process of doing something similar; I'm also working on a t-shirt store, peacefulconquest.com (still a work in progress). From what I have researched (and what Veronica already said), it won't negatively affect your main website's ranking, since you are simply targeting new keywords for your site. You are not, so to say, "diluting" your existing website, because your new content is published on your blog (different URLs).
However, I would still try to produce content for your blog that is at least remotely linked to your main topic, since that would also help boost your sales. What's the point of increased traffic if it is unrelated to what you do and doesn't convert?
-
Hi,
Interesting question!
As far as I know it will not damage the main business. From the algorithms' "point of view", you will have a broader set of keywords associated with the website, although it will be hard to position those pages unrelated to the main content; it also depends on the bounce rate and whether those pages attract visitors. That said, just as in everyday life, people do not trust a business with multiple faces; i.e., your buyers wish to see content related to the t-shirts, fabrics, and designs.
A professional website needs to be coherent, therefore if you plan to set up a blog I suggest that you post about the main business. Good luck and Happy New Year 2020.
Mª Verónica
Related Questions
-
For a parent blog on our website, what should we go for - Subdomain or Subdirectory?
We are a kids' website with fun learning content dedicated to kids aged 6-14 years. We now want to start a blog page for parents with parenting tips and information useful to them. For this, should we choose a subdomain or a subdirectory?
SEO Learn Center | Mocomi
-
HTTP vs HTTPS
What are the pros and cons of each? Looking for more information about this. Thank you ahead of time.
SEO Learn Center | Essential-Pest
-
Site for my clients to log in and see their traffic, etc.
I have done a ton of research and I am struggling to find an easy-to-use, easy-to-understand site/tool that will allow my clients to log in and see basic information about their website traffic, rankings, referring sites, etc. in an ATTRACTIVE, EASY TO UNDERSTAND layout. Any suggestions would be greatly appreciated! SEOmoz (my true favorite), Raven, WebCEO, and so many others are powerful tools - I don't need that; I'm just looking for an attractive place for them to log in and view their stats, that's it. Thank you - have a great rest of the week! Matthew
SEO Learn Center | Mrupp44
-
Help Finding Specific SEOMoz Blog Post - Help meh puhleeze
Hi everyone! Hopefully everyone had a great weekend. I'm having trouble finding a specific blog post (from the SEOMoz blog) pertaining to content writing. I think (but could be wrong) that it was written by Rand, but I'm having the most difficult time finding it. The overall theme of the post was about writing/blogging about topics OUTSIDE of whatever your niche is. I didn't bookmark the post (doh!), but its theme was basically to not just write about stuff pertaining to your niche on your blog, but to write about other topics as well (i.e., if I have a site about blue widgets, not only writing about blue widgets). Is there anyone familiar with the post I am referring to who could point me in the right direction? Thanks a ton
SEO Learn Center | bashseo
-
Forecasting Seasonal Keyword Traffic with Python Script
A few weeks back, I went to a Distilled meetup here in NYC. SEER Interactive's Mark Lavoritano did some cool slides on the seasonality of keywords. Basically, his presentation made the point that you should not only think about which keywords you want to rank for but also WHEN they are most valuable. This made me think... we have a lot of moving parts to our marketing efforts: emails with interchangeable modules, a homepage with interchangeable links, and other dynamic elements for which we have to decide what themes we want to market for the week. Babies or bikes? Kitchen gadgets or wine glasses? Google Insights for Search is a great tool which allows you to look at keyword traffic year over year. However, for many keywords (like the ones mentioned above), on a multi-year timeframe it can be tough to sift out the specific weeks in which traffic repeatedly peaks year after year. What I really wanted to see was the last 5 years laid on top of each other to find the common peaks. Even better, if I could map 5 years of keyword data to a single row in a spreadsheet and then use conditional formatting to create a color scale, I could create a sweet forecasting calendar with several keywords and use this to choose the best timing for various marketing campaigns. Here's a link to a screenshot of the calendar I created: forecasting calendar.
I could have done this in Excel, but I've been wanting to try out Python for a while now and decided this was a great time to do it. After some research, I figured out how to import a CSV into Python, and the rest was done with for loops and lists, which is fairly basic Python. In a nutshell, the program runs through all 5 years of traffic data and increments a count in a list whenever it sees a peak (according to a threshold called "peakInterestValue" that you set in the code). The output is a list of 52 numbers in the range [0-5] (representing 52 weeks over 5 years). If the value is a 5, all 5 years showed a peak in traffic that week; if it's a 4, then 4 out of 5 years showed a peak that week, and so on. You can then copy/paste this to a row in an Excel sheet with all your keywords, apply a color scale with conditional formatting, and boom! You've got a forecasting calendar. This code works on the exact file that Google Insights exports, so you don't need to format it at all. It's ready to rock. You'll need to install Python first: http://www.python.org has installation info and great tutorials as well. Enjoy! Here is the code, cleaned up:

import csv

# Open the last 5 years of weekly data exported from Google Insights
readers = [csv.reader(open("anniversarygift%d.csv" % year, "r"))
           for year in range(2007, 2012)]

# Read 52 weekly interest values per year, skipping the 5 header rows
# that Google Insights puts at the top of each export
yearly_data = []
for reader in readers:
    weeks = []
    for j, row in enumerate(reader):
        if j <= 4:
            continue
        weeks.append(int(row[1]))
        if len(weeks) >= 52:
            break
    yearly_data.append(weeks)

# Lower peakInterestValue to lower the traffic threshold and discover more peaks
peakInterestValue = 90
peakInterestWeeks = [0] * 52  # one counter per week of the year
peakInterestCnt = 0           # helps you tune peakInterestValue

for weeks in yearly_data:
    for j in range(52):
        if weeks[j] > peakInterestValue:
            peakInterestWeeks[j] += 1  # one more year peaked in this week
            peakInterestCnt += 1

print("Peak interest", peakInterestWeeks)
print("Peak Interest Count =", peakInterestCnt)

# Write peakInterestWeeks out as a single row for Excel conditional formatting
with open("anniversarygift.csv", "w", newline="") as out:
    csv.writer(out).writerow(peakInterestWeeks)
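To make the peak-counting idea concrete, here is a tiny self-contained illustration of the same logic on made-up weekly data (three "years" of four weeks each, not real Google Insights numbers):

```python
# Toy illustration of the peak-counting logic: three "years" of four
# weekly interest values each (made-up numbers).
years = [
    [40, 95, 60, 91],
    [35, 97, 55, 80],
    [50, 92, 70, 93],
]

peak_threshold = 90          # same role as peakInterestValue
peak_weeks = [0] * 4         # one counter per week
for year in years:
    for week, value in enumerate(year):
        if value > peak_threshold:
            peak_weeks[week] += 1   # one more year peaked in this week

print(peak_weeks)  # [0, 3, 0, 2]: week 2 peaks every year, week 4 in two
```

A week that scores 3 here peaked in all three years, which is exactly the signal the forecasting calendar highlights with conditional formatting.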
SEO Learn Center | znotes
-
Subdomain Structure vs Top Level Domain Site; An Actual Case
A major non-profit with 360 locations just completed a new program. The non-profit has updated its site at URL www.mysite.com. Each of the 360 locations has a subdomain site with URL http://mylocation.mysite.com, which reflects the branding of www.mysite.com and has a Drupal content management system. The company is asking each location to transition its top-level domain to the subdomain (for example, www.mysite.com to http://mylocation.mysite.com). The subdomain location sites are tied to a CRM system, which initially hasn't worked, but the long-term objective is to gather CRM data via the subdomain location sites. The main site has an Alexa global rank of 30,400 and a US rank of 7,900. The main site has 2,600 backlinks and is a pretty powerful site. Some individual locations in large metropolitan areas have decent backlinks, but most don't. The PR of most local sites is 3 to 6. The SEO experts at www.mysite.com have maintained that the power and authority of the main site would propel the local rank of each location to a position of high visibility. Using Market Samurai and Majestic SEO, a test on each of the 360 location subdomains was performed. It was determined that Page Rank passed from the main site to the local site in 15% of cases; the other 85% of the subdomain sites had 0 Page Rank. Page Rank varied from 2 to 5 on the 15% that passed rank, but there was no cross-checking to see whether the subdomain location had the same Page Rank as the location's top-level domain. Referring Domains Domain (RDD) was usually -1; Referring Domains Page (RDP) was usually 0; Page Backlinks (BLP) was usually 0; Back Links to Domain (BLD) was usually 0; Google cache age was all over the map, from unknown to 58 days. Questions: Does anyone suspect why some sites pass Page Rank and others don't? Should we continue on the path towards using the subdomain sites, or should each location keep its top-level site and work on improving its local search visibility there?
There is no way to check Google Analytics (GA) on the subdomain sites without accessing GA on the main site www.mysite.com, right? Will Google Places and Bing Local honor the subdomain URLs for their respective local search pages? Would there be a compelling reason or reasons to abandon the subdomain sites? Thanks much for your comments!
SEO Learn Center | VernonWanner
-
Best blogs to follow?
Hey everybody, I'm fairly new to SEO and would like to ask which blogs would be beneficial to follow to keep ahead of the ever-changing algorithms and methods. I'm already an avid reader of the SEOMoz blog, which I find an invaluable source of information. Any suggestions would be greatly appreciated. Thanks
SEO Learn Center | CPASEO