Is Tag Manager a good option to insert text in websites?
-
When a website doesn't have an administration panel, adding text is a big problem.
-
If you can do it: yes! Is it easy: quite easy if you know basic JavaScript. Is it a good option: maybe, but probably not. You can use GTM to make some quick changes, but if you want to make more structural changes, you definitely don't want to use GTM as a way to hack your site around.
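For context, the "basic JavaScript" route means pasting a script into a GTM Custom HTML tag, which then runs in visitors' browsers. A minimal sketch of what such a tag might look like; the `h1` selector, the `gtm-notice` class, and the message text are illustrative assumptions, not part of any real site:

```javascript
// Hypothetical GTM Custom HTML tag that inserts a short notice after the
// page's first <h1>. Selector, class name, and message are assumptions
// chosen for illustration only.

// Build the markup string to inject.
function buildNotice(text) {
  return '<p class="gtm-notice">' + text + '</p>';
}

// GTM executes this in the browser; the guard lets it no-op elsewhere.
if (typeof document !== 'undefined') {
  var heading = document.querySelector('h1');
  if (heading) {
    // insertAdjacentHTML adds the notice right after the heading element.
    heading.insertAdjacentHTML('afterend', buildNotice('Free shipping this week.'));
  }
}
```

Note that text injected this way only appears after GTM loads, so it can flicker in late and may not be reliably seen by all crawlers — one more reason it is a stopgap rather than a real content-management solution.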
-
While it is possible, it is not easy to set up, nor would I recommend it. If you are having issues to the level that you are considering using GTM as your CMS, it is time to invest in a proper CMS for your website.
Related Questions
-
Good to use disallow or noindex for these?
Hello everyone, I am reaching out to seek your expert advice on a few technical SEO aspects related to my website. I highly value your expertise in this field and would greatly appreciate your insights.
Technical SEO | williamhuynh
Below are the specific areas I would like to discuss:

a. Double and triple filter pages: I have identified certain URLs on my website that have a canonical tag pointing to the main /quick-ship page. These URLs are as follows:
https://www.interiorsecrets.com.au/collections/lounge-chairs/quick-ship+black
https://www.interiorsecrets.com.au/collections/lounge-chairs/quick-ship+black+fabric
Considering the need to optimize my crawl budget, would it be advisable to disallow or noindex these pages? My understanding is that by disallowing or noindexing these URLs, search engines can avoid wasting resources on crawling and indexing duplicate or filtered content.

b. Page URLs with parameters: I have noticed that some of my page URLs include parameters such as ?variant and ?limit. Although these URLs already have canonical tags in place, I would like to understand whether it is still recommended to disallow or noindex them to further conserve crawl budget, so that search engines don't spend resources indexing redundant variations of the same content.

Additionally, I would welcome any suggestions regarding internal linking strategies tailored to my website's structure and content. Thank you in advance for your time and expertise. If you require any further information or clarification, please let me know. Cheers!
Website crawl error
Hi all, When I try to crawl a website, I get the following error message: "java.lang.IllegalArgumentException: Illegal cookie name". For the moment, I found the following explanation: the errors indicate that one of the web servers within the same cookie domain as the server is setting a cookie for your domain with the name "path", as well as another cookie with the name "domain". Does anyone have experience with this problem, know what it means, and know how to solve it? Thanks in advance! Jens
Technical SEO | WeAreDigital_BE
301 redirection for e-commerce website
Hi Moz community, I am the web agency for an e-commerce website. Its current domain is https://www.liquorland.co.nz, but all the e-commerce part will now be moved to a sub-domain, https://shop.liquorland.co.nz. There are thousands of e-commerce pages currently indexed in Google (roughly 15,500), plus they also have a mobile version of each page, like /mobile/default.aspx. Is it necessary to 301 redirect all the pages? We are afraid it may slow down the website because each request will go through thousands of redirect rules. Is it OK to just redirect the main categories? Many thanks in advance.
Technical SEO | russellbrown
Server 500: website deindexed?
Hi mozzers, Since August 22nd, a site (not one I manage) has had a 500 server error and all of its pages got deindexed. This is obviously a server issue, but why did it get deindexed? Is it because the server issue has lasted a while? The pages I checked load correctly, so I am a bit confused here! The site's webmaster account shows 1,500 server errors! Can someone tell me what is going on and how to fix it? Thanks
Technical SEO | Ideas-Money-Art
Two companies merge: website A redirect 301 to website B. Problems?
Hi, last December the company I work for (company A) and another company (company B) merged. The website of company A was taken offline and its home page was 302 redirected to a page on website B. This page had information about the merger and the consequences for customers. The deeper pages of website A were 301 redirected to similar pages on website B.

After a while, the traffic from the redirected home page decreased, and we thought it was time to change the redirect from a 302 into a 301 to the home page, because there are still a lot of links to the home page of website A and we wanted to preserve the link juice. Two weeks ago we changed the 302 redirect from website A into a 301 redirect to the home page of website B.

Last week the Google Webmaster Tools account of website B showed the links from the 301-redirected website A. The total number of links doubled, and the top anchor text is now the name of company A instead of company B. This could, of course, trigger an alarm at Google, because we got a lot of new links with a different anchor text, a tactic used by spammers and black hats. I am a bit worried that our change will be penalized by Google, but our change is legitimate: it is to the advantage of our customers to find us if they search for the name of company A or click on a link to website A.

We haven't yet submitted a change of address for domain A in Google Webmaster Tools. Is it a good idea to submit a change of address from domain A to domain B? Are there other precautions we can take?
Technical SEO | NN-online
Websites not being included on google from mobiles?
Hi, I just had a call from a guy saying that Google has made a statement that it will stop people finding websites from mobile devices if they don't have a mobile domain name. Does anyone know anything about any such Google statement, or is this just rubbish?
Technical SEO | Ant71
Google Website Optimizer
So if you are A/B testing two pages, index.html and indexB.html, shouldn't I nofollow indexB.html? It has all the same content, just a different design.
Technical SEO | tylerfraser
Website has been penalized?
Hey guys, We have been link building and optimizing our website since the beginning of June 2010. Around August-September 2010, our site appeared on the second page for the keywords we were targeting for around a week. It then dropped off the radar, although we could still see our website as #1 when searching for our company name, domain name, etc., so we figured we had been put into the 'Google sandbox' sort of thing. That was fine, we dealt with that. Then in December 2010, we appeared on the first page for our keywords and maintained first-page rankings, even moving up the top 10 for just over a month. On January 13th 2011, we disappeared from Google for all of the keywords we were targeting; we don't even come up in the top pages for a company name search, although we do come up when searching for our domain name in Google, and we are being cached regularly.

Before we dropped off the rankings in January, we did make some semi-major changes to our site: changing meta descriptions, changing content around, and adding a disclaimer to our pages with click-tracking parameters (this is when SEOmoz prompted us that our disclaimer pages were duplicate content). So we added the disclaimer URL to our robots.txt so Google couldn't access it, made the disclaimer an onclick link instead of an href, added nofollow to the link, and also told Google to ignore these parameters in Google Webmaster Central. We have fixed the duplicate content side of things now, we have continued to link build, and we have been adding content regularly. Do you think the duplicate content (on over 13,000 pages) could have triggered a loss in rankings? Or do you think it's something else? We changed the index page's meta description and some subpages' titles and descriptions. We also fixed up HTML errors signaled in Google Webmaster Central and SEOmoz.
The only other reason I think we could have been penalized is having a link exchange script on our site, where people could add our link to their site and add theirs to ours, but we applied the nofollow attribute to those outbound links. Any information that will help me get our rankings back would be greatly appreciated!
Technical SEO | bigtimeseo