Is There A Way To Automatically Ping Your New Content?
-
Hi, I believe that if you have WordPress you can automatically ping your site when you publish new content, but I am using Joomla and would like to know if you can do the same with this type of site.
I write lots of articles each day, and it takes time to keep pinging each one, so I am wondering if there is an easier way of doing it.
-
There is a field in the WordPress dashboard (under Settings > Writing > Update Services) where you can list the URLs of the services you want to ping, and WordPress will notify them automatically whenever you publish or update a post.
However, if you would like to do so manually and hit a list of different services, try Ping-o-Matic.
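For anyone who wants to script this rather than rely on an extension, the mechanism behind these services is just an XML-RPC call to the standard weblogUpdates.ping method. A minimal Python sketch of the idea (the site name and URL are placeholders, and you should confirm the current Ping-O-Matic endpoint in their documentation):

```python
import xmlrpc.client

# Ping services that accept the standard weblogUpdates.ping call.
# The Ping-O-Matic endpoint below is the commonly documented one --
# double-check it before relying on it.
PING_ENDPOINTS = [
    "http://rpc.pingomatic.com/",
]

SITE_NAME = "My Joomla Site"           # placeholder -- your site name
SITE_URL = "https://www.example.com/"  # placeholder -- your site URL

def ping_all():
    for endpoint in PING_ENDPOINTS:
        try:
            server = xmlrpc.client.ServerProxy(endpoint)
            # weblogUpdates.ping(site name, site URL) tells the service
            # your site has new content and should be re-checked.
            response = server.weblogUpdates.ping(SITE_NAME, SITE_URL)
            print(endpoint, "->", response)
        except Exception as exc:
            print(endpoint, "failed:", exc)

if __name__ == "__main__":
    ping_all()
```

Hooked into a cron job or your publishing workflow, something like this would ping every service on the list each time a new article goes live.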
-
Many thanks, this is excellent. Strange, because I did search the extensions directory but must not have used the correct search term. This will make my life a lot easier.
-
Yes, of course you can. Look for pinger modules for Joomla; try searching on Google.
-
Hi Diane,
Have you tried any of these? Joomla has an extension directory that lists several pinging extensions. Also, a Bing search on [joomla ping] turned up several useful results with detailed explanations.
Related Questions
-
Old domain to new domain
Hi, a website on server A is no longer required. The owner has redirected some URLs of this website (via a plugin) to his new website on server B, but not all of them. So when I use a site: search on website A, I see a mixture of redirected and non-redirected URLs; two websites are therefore still being indexed in some form and causing duplication. Weirdly, though, when I crawl with Screaming Frog I only see one URL, which is 301 redirected to the new website. I would have thought I'd see lots of URLs which hadn't been redirected. How come it is different from the site: command? Anyway, how do I move to the new website completely so the old one is no longer indexed? I thought I knew this, but I've read so many blogs I've confused myself! Should I: (a) redirect all URLs via the .htaccess file on the old website on server A? There are lots of pages indexed, so a lot of URLs. What if I miss some? Or (b) point the old domain via DNS to server B and do the redirects in website B's .htaccess file? This seems more sensible, but does this method still retain the website's rankings? Thanks for any help.
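To cover the "what if I miss some?" worry, I was thinking a quick script could fetch every old URL I can find (e.g. from the site: results) and flag any that don't 301 to the new domain. A rough sketch, assuming the requests library and with made-up URLs:

```python
import requests

# Hypothetical list of old URLs, e.g. exported from the site: search results.
OLD_URLS = [
    "https://old-site-server-a.example/page-1",
    "https://old-site-server-a.example/page-2",
]
NEW_DOMAIN = "new-site-server-b.example"

for url in OLD_URLS:
    # Don't follow redirects automatically -- we want to inspect the first hop.
    resp = requests.get(url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    if resp.status_code == 301 and NEW_DOMAIN in location:
        print("OK       ", url, "->", location)
    else:
        print("NOT DONE ", url, "status", resp.status_code)
```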
Technical SEO | AL123al
-
404 or 503 for Malware Content?
Hi folks, a malware question: I have a site that uses iframes to show content from third-party sites, which at times gets infected. Would you recommend 404ing or 503ing the pages with the iframe until the issue is resolved? (I am inclined to use 503.) And then, should I take the 404/503 off before asking for a reindex (from the GWT malware section), or ask for a reindex as soon as the 404/503 goes up? (I understand we would be asking Google to reindex a non-existent page, but the malware warning gets removed.) PS: it makes sense for this business to showcase content using iframes on these special pages; I do understand this is not the best way to go about SEO.
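For what it's worth, serving the 503 itself is simple; here is a minimal, purely illustrative Python sketch of the idea (in practice this would be a rule in the CMS or web server, not a standalone script):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class MaintenanceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # 503 = temporarily unavailable; Retry-After hints when crawlers
        # should come back (86400 seconds = 24 hours).
        self.send_response(503)
        self.send_header("Retry-After", "86400")
        self.send_header("Content-Type", "text/plain; charset=utf-8")
        self.end_headers()
        self.wfile.write(b"This page is temporarily unavailable while we clean it up.")

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), MaintenanceHandler).serve_forever()
```

In HTTP terms a 503 says "come back later" while a 404 says "this page is gone", which is part of why I am leaning towards the 503 for a temporary cleanup.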
Technical SEO | Saijo.George
-
External video content in iframe
Hi, on our site we have a lot of video content. The player is hosted by a third party, so we use an iframe to include the content on our site. The problem is that the content itself (on the third-party domain) is shown in the Google results. My question is: can we ask the third party to disallow the content from indexing in their robots.txt, or will that also affect our own use of the video content? For example, we use video sitemaps to include the videos in Google video search (the sitemap links to the videos on our own domain, but we still use iframes on the pages to pull the content from the third-party domain, which would then be blocked by robots.txt). I hope you understand what I mean... Any suggestions? Thanks a lot!
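To make the question concrete, my understanding of robots.txt scoping is that a disallow on the video host only applies to URLs on their domain, not to pages or sitemap URLs on ours. A small Python sketch with a hypothetical domain and rule:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt the video host could serve on THEIR domain.
THIRD_PARTY_ROBOTS = """
User-agent: *
Disallow: /player/
"""

rp = RobotFileParser()
rp.parse(THIRD_PARTY_ROBOTS.splitlines())

# The embedded player URL on the third-party domain would be blocked:
print(rp.can_fetch("Googlebot", "https://videohost.example/player/12345"))  # False

# Our own pages that merely contain the iframe live on our domain and are
# governed by our robots.txt, as are the video URLs listed in our video
# sitemap -- so those would not be blocked by the third party's file.
```

What I can't work out is whether blocking the player URLs over there would still hurt the videos' visibility over here, which is really the question above.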
Technical SEO | Googleankan
-
Does Google know what footer content is?
We plan to do away with fixed footer content and make the content in the traditional footer area, for the most part, unique, just like the 'main' part of the content. This raises the question: does Google know what is footer content as opposed to main on-page content?
Technical SEO | NeilD
-
Pages with content defined by querystring
I have a page that shows travel tips: http://www.spies.dk/spanien/alcudia/rejsemalstips-liste This page shows all travel tips for Alcudia. Each travel tip also has its own URL: http://www.spies.dk/spanien/alcudia/rejsemalstips?TravelTipsId=19767 Two weeks ago I noticed the URL http://www.spies.dk/spanien/alcudia/rejsemalstips showing up in Google Webmaster Tools as a 404 page, along with hundreds of other URLs to the subpage /rejsemalstips WITHOUT a querystring. With no querystring there is no content on the page, and it returns a 404. I need my technicians to redirect that page so it shows the list, but in the meantime I would like to block it in robots.txt. But how do I block a page if it is called without a querystring?
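The redirect itself should be a one-liner for the technicians; a hypothetical Python/Flask sketch of the rule, just to show the logic (the real site is obviously not Flask):

```python
from flask import Flask, redirect, request

app = Flask(__name__)

# Route name mirrors the URL structure from the question.
@app.route("/spanien/alcudia/rejsemalstips")
def travel_tip():
    tip_id = request.args.get("TravelTipsId")
    if not tip_id:
        # No id in the querystring: 301 to the full list instead of a 404.
        return redirect("/spanien/alcudia/rejsemalstips-liste", code=301)
    return render_tip(tip_id)  # placeholder for the real template logic

def render_tip(tip_id):
    return f"Travel tip {tip_id}"
```

Until that is in place, though, I still need an answer to the robots.txt part of the question.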
Technical SEO | alsvik
-
Prospective new client hit by webspam looking for new resource
Technical SEO | tcmktg
Background: Prospective client recently hit by the webspam update. (I have verified hundreds of low-quality links, porn links, backlink exchanges, etc.) They want us to step in, remove the bad links, and start over.
Question: What is the best way to examine all the links to determine which need to be removed? We can create the report from Open Site Explorer, but how can we identify the bad links? Here are the site metrics: 5,000+ linking domains, so in this example we need to research 5,000 domains and possibly send notifications to thousands of webmasters to remove the links. Open Site Explorer reports about 25,000 total links, but the root domain figures are shown below. Yikes.
Domain Authority 75
External Followed Links 112,000
Total External Links 115,000
Total Links 150,000
Followed Linking Root Domains 3,900
Total Linking Root Domains 5,300
Linking C Blocks 2,700
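As a starting point, I was planning to script a first pass over the link export: flag source URLs that match obvious spam patterns and group them by domain, then review the rest manually. A rough sketch, assuming a CSV export with a "Source URL" column (adjust the column name and keywords to your data):

```python
import csv
from collections import Counter
from urllib.parse import urlparse

# Hypothetical spam indicators -- tune these to what you actually see.
SPAM_HINTS = ["porn", "casino", "viagra", "pills", "payday"]

flagged = Counter()
with open("inbound_links.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        source = row.get("Source URL", "")  # assumed column name
        if any(hint in source.lower() for hint in SPAM_HINTS):
            flagged[urlparse(source).netloc.lower()] += 1

# Domains with the most obviously spammy links, to review (manually) first.
for domain, count in flagged.most_common(50):
    print(f"{count:5d}  {domain}")
```
-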
Entry based content and SEO
My e-commerce team is implementing functionality that allows us to display different content based on which channel, and even which keyword, the customers used to reach our page. This is of course a move that we believe will strengthen our conversion rates, but how will this affect our organic search listings? Do you have any examples of how this could affect us, and are there any technology pitfalls that we absolutely need to know about?
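For context, the kind of logic the team is building is roughly this (a hypothetical Python sketch keyed off UTM parameters; the relevant point for SEO is that visits without campaign parameters, including crawlers, fall back to the default content):

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical mapping from acquisition channel to a content variant.
VARIANTS = {
    "newsletter": "variant_a",
    "cpc": "variant_b",
}
DEFAULT_VARIANT = "default"  # what direct visitors and crawlers would see

def pick_variant(landing_url: str) -> str:
    params = parse_qs(urlparse(landing_url).query)
    source = params.get("utm_source", [""])[0].lower()
    return VARIANTS.get(source, DEFAULT_VARIANT)

print(pick_variant("https://www.example.com/offer?utm_source=newsletter"))  # variant_a
print(pick_variant("https://www.example.com/offer"))                        # default
```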
Technical SEO | GEMoney_No
-
Duplicate Content issue
I have been asked to review an old website to identify opportunities for increasing search engine traffic. Whilst reviewing the site I came across a strange loop. On each page there is a link to a printer-friendly version: http://www.websitename.co.uk/index.php?pageid=7&printfriendly=yes That page also has a link to a printer-friendly version: http://www.websitename.co.uk/index.php?pageid=7&printfriendly=yes&printfriendly=yes and so on and so on... Some of these pages are being included in Google's index. I appreciate that this can't be a good thing; however, I am not 100% sure of the extent to which it is a bad thing, or the priority that should be given to getting it sorted. Just wondering what views people have on the issues this may cause?
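For reference, the interim fix I had in mind is to normalise those URLs so every printer-friendly variant collapses back to a single version (and point rel=canonical at it). A rough Python sketch of the normalisation idea; the site itself is PHP, so this is only to show the logic:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def normalize(url: str) -> str:
    """Keep only the first occurrence of each query parameter, so repeated
    printfriendly=yes parameters collapse to a single canonical URL."""
    parts = urlparse(url)
    seen = {}
    for key, value in parse_qsl(parts.query):
        seen.setdefault(key, value)
    return urlunparse(parts._replace(query=urlencode(seen)))

url = "http://www.websitename.co.uk/index.php?pageid=7&printfriendly=yes&printfriendly=yes"
print(normalize(url))
# http://www.websitename.co.uk/index.php?pageid=7&printfriendly=yes
```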
Technical SEO | CPLDistribution