[SOLVED] Automatically have phpList Retrieve/Update RSS Feed and Delete Feed Items older than 1 day

After two months and probably 60 hours of learning, this is the closest I have been to actually getting phpList to work for sending daily RSS-updated newsletters to email subscribers.

I currently have it sending automatically, but now I have hit a roadblock: getting the RSS feed to update automatically.

According to the phpList instructions here: https://resources.phplist.com/plugin/rssfeed#retrieving_rss_feed_items, this can be done using a cron job. However, my host, iPage, apparently doesn’t support cron jobs, so the way I see it I have three options:

  1. Log into my site’s phpList once a day, before the newsletter is sent, and update the RSS feed. It’s only one click, but it means remembering to log in every day.
  2. Set up a Linux machine (probably a Raspberry Pi) to run the cron job.
  3. Use a service like cron-job.org to schedule the cron job.
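For what it’s worth, option 2 boils down to a single crontab entry on the Linux machine. A minimal sketch, assuming phpList is installed at /var/www/lists (the path and schedule are placeholders; the -p get -m RssFeedPlugin arguments follow the command shown later in this thread):

```shell
# Hypothetical crontab entry: fetch new RSS items every day at 06:00,
# before the daily campaign goes out. /var/www/lists is a placeholder path.
0 6 * * * php /var/www/lists/admin/index.php -p get -m RssFeedPlugin -c /var/www/lists/config/config.php
```

Edit it in with `crontab -e`; the five leading fields are minute, hour, day of month, month, and day of week.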

I think option 3 would be easiest, so long as it works (I have not tested it). However, when I try to set up the job on cron-job.org, it asks for what appears to be a web address (URL); I typed “Retrieve RSS feed” in as the title. Is the URL field where I put the command shown in the phpList instructions linked above, or only part of it? Further down the setup page there is an advanced-options section, so perhaps part of the command belongs there instead.

I am wondering if anyone here has suggestions on which option would be best for me. I don’t have any experience with cron jobs and only limited experience with Linux.

Greatly appreciated…

@bargainbrother The cron-job.org site appears to request a URL, i.e. a phpList page, which is different from a cron job running as a command line on your own web server. The documentation is referring to the latter, which is the recommended way of doing it.

You can try using the URL for the “Fetch RSS items” page which will be something like


but you also need to add an admin user name and password to the URL


The real drawback of this is that a phpList admin ID and password are stored on cron-job.org and sent over the internet.
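To make the shape of that concrete, here is an illustrative sketch of the kind of URL an external cron service would request. The domain and the login/password parameter names are assumptions, not confirmed plugin syntax — check the plugin documentation for the exact query string:

```shell
#!/bin/sh
# Sketch only: assemble the fetch URL that a service like cron-job.org would hit.
# Every value below is a placeholder, not a real site or credential.
SITE="https://www.example.com/lists"
LOGIN="admin"        # hypothetical admin-login parameter
PASSWORD="changeme"  # hypothetical password parameter
URL="${SITE}/admin/?page=get&pi=RssFeedPlugin&login=${LOGIN}&password=${PASSWORD}"
echo "$URL"
# The cron service then simply requests this URL once a day, e.g.:
#   curl -s "$URL"
```

Pasting the resulting URL into cron-job.org’s URL field is all the “command” it needs; the advanced options are not required for this.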


Very cool, Duncan! That worked.

The other way I was attempting this was to create an “agent” PHP file on my web host and then execute it via its URL.

I got as far as making the file which contains:

<?php echo shell_exec("/lists/admin/index.php -p get -m RssFeedPlugin -c /lists/config/config.php"); ?>

However, executing this was not working for me. Does this method make sense to you?


@bargainbrother I don’t think that it is worth trying to get your other approach to work. In any case, you would need to use the full filesystem paths to the files, not just the paths from the web root.
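For completeness, “full paths” means something like the following when the command is run from a shell; /home/username/public_html is an invented example of the kind of absolute path shared hosts use, and the explicit php interpreter is needed because shell_exec() will not run a .php file on its own:

```shell
# Hypothetical full-path version of the command the agent script tried to run.
# A web-root path like /lists/admin/index.php does not resolve on the command
# line, which is one reason the agent approach failed as written.
php /home/username/public_html/lists/admin/index.php -p get -m RssFeedPlugin -c /home/username/public_html/lists/config/config.php
```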

I have upgraded the plugin so that you can use the phpList remote processing secret instead of an admin name and password. That reduces the risk of unauthorised access to phpList. You can get the secret value from the Settings page, then use a URL similar to this


To use this approach you need to upgrade the RSS Feed plugin and also Common Plugin on the Manage Plugins page.
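The upgraded setup then only exposes the remote processing secret rather than a login. Another illustrative sketch — the domain and secret value are placeholders, and the parameter name `secret` is my assumption, so verify it against the plugin documentation:

```shell
#!/bin/sh
# Sketch only: fetch URL using the phpList remote processing secret instead
# of admin credentials. Domain and secret value below are placeholders.
SECRET="0123456789abcdef"  # copy the real value from the phpList Settings page
URL="https://www.example.com/lists/admin/?page=get&pi=RssFeedPlugin&secret=${SECRET}"
echo "$URL"
# Point cron-job.org (or curl from any machine) at this URL once a day:
#   curl -s "$URL"
```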


In my endeavors to figure out how to make this work, I actually tried that approach early yesterday, before you had added it, so of course it didn’t work at the time. Thank you very much, sir, for adding it. I will implement it today.

Since you’ve been so helpful, would you know if there could be a URL to delete RSS feed items older than X days? (1 day for me).

I think potentially you could use your URL with “get” changed to “delete”, but then you would also need some way to specify how many days of items to delete.

Of course, this could all be unnecessary if the plugin only sends new posts that it has not sent before, which I believe it already does.

@bargainbrother The delete page cannot be run from a cron job, only through a browser. But there is no great reason why it could not, so I will look into that.

To confirm, the plugin doesn’t send feed items more than once. The purpose of the delete function is simply to avoid the database tables getting too large. Using the admin browser page once a week or so should be sufficient.
