Support

Akeeba Backup for Joomla!

#28129 Error - Timing Out - on CLI backup only

Posted in ‘Akeeba Backup for Joomla! 4 & 5’
This is a public ticket

Everybody will be able to see its contents. Do not include usernames, passwords or any other sensitive information.

Environment Information

Joomla! version
n/a
PHP version
n/a
Akeeba Backup version
n/a

Latest post on Sunday, 27 August 2017 17:17 CDT

GJSchaller
When I attempt to run a CLI backup of a specific site using a CRON job, it fails. The error from the ALICE analyzer is:

------ BEGIN OF ALICE RAW OUTPUT -----
Timeout while backing up
There is already an issue with the backup engine saving its state. Please fix it before continuing.

------ END OF ALICE RAW OUTPUT -----

The settings suggested are already in place (1 sec / 10 sec / 75%). Of note:

- Host is SiteGround
- Backend backup of this site works without issue
- Other sites on same server do not have this issue
- This site is large due to a very large photo gallery and forums.

Log is attached.

Thank you for the help!

nicholas
Akeeba Staff
Manager
It looks like SiteGround has set the timeout for CRON tasks to 10 minutes which, if I recall correctly, is the default on their mid-range hosting plans. Since the backup takes longer than that, the operating system kills the CRON process. Ask them to raise that limit for you. Since you have already taken a backend backup, find out its duration from the Akeeba Backup interface (the Manage Backups page). Tell SiteGround that this is roughly how high you need the limit raised, and do tell them that you need it to take a full site backup with Akeeba Backup.

Nicholas K. Dionysopoulos

Lead Developer and Director

πŸ‡¬πŸ‡·Greek: native πŸ‡¬πŸ‡§English: excellent πŸ‡«πŸ‡·French: basic β€’ πŸ• My time zone is Europe / Athens
Please keep in mind my timezone and cultural differences when reading my replies. Thank you!

GJSchaller
I opened a case with SiteGround support, and they modified my CRON job to use PHP CLI instead of just PHP - we'll see on Thursday if it works. :-)

GJSchaller
The job was run using the following command:

/usr/local/bin/php-cli /home/(accountname)/public_html/cli/akeeba-backup.php

The backup ran, but failed again after about 10 MB. The web interface backup ran without issue.

Logs from the CLI backup on July 20th are attached.

nicholas
Akeeba Staff
Manager
There is no indication of what happened. It was backing up a table when it was killed. Is it possible that you are on a shared or VPS hosting plan with CPU usage caps or other kinds of process limits (ulimit in Linux)? That would explain why the CRON server is killing the CRON job abruptly.
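
If you have SSH access, you can get a rough idea of any shell-level limits yourself. This is just a generic Linux check, not something specific to SiteGround, and the limits applied to CRON jobs may differ from those of your interactive shell:

ulimit -a   (lists all limits for the current shell: CPU time, memory, number of processes, ...)
ulimit -t   (maximum CPU seconds per process; "unlimited" means no CPU-time cap)

It may also help to capture the CRON job's own output by appending a redirect to the command; the log file path below is just an example:

/usr/local/bin/php-cli /home/(accountname)/public_html/cli/akeeba-backup.php >> /home/(accountname)/cli-backup.log 2>&1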

nicholas
Akeeba Staff
Manager
In case the inference isn't as clear to you as it is to me: running the backup from the CLI is a single process. Running the backup from the web interface is several processes, each one running for a limited amount of time (each page load is a separate process). These web processes are handled by the web server, Apache, not the CRON daemon.

Let's also try something different: use the akeeba-altbackup.php script for your CRON job. This uses the legacy front-end backup feature, which runs the backup through the web server. The CRON job simply sits there and waits for each page load to end, then moves on to the next one until the backup is done. This will give us more data points on whether the problem is the CRON process being killed.
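
As a sketch, the CRON entry for the alternative script would look something like the line below. The schedule (Thursday at 03:00 in this example) is a placeholder, and the PHP binary should be whichever one your host confirms is the CLI build:

0 3 * * 4 /usr/local/bin/php-cli /home/(accountname)/public_html/cli/akeeba-altbackup.php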

GJSchaller
OK, thank you - I am assuming that this is NOT the CLI version of PHP... the new command is:

/usr/local/bin/php /home/(accountname)/public_html/cli/akeeba-altbackup.php

Is that correct? If not, I'll change it before Thursday when the job runs again.

Thank you!

nicholas
Akeeba Staff
Manager
I believe that /usr/local/bin/php is PHP-CGI, which has time limits applied to it and is not suitable for CRON jobs. That explains a lot.

Try this instead:

/usr/local/php56/bin/php-cli /home/(accountname)/public_html/cli/akeeba-backup.php

and if it fails give it a go with the altbackup script:

/usr/local/php56/bin/php-cli /home/(accountname)/public_html/cli/akeeba-altbackup.php

I know that on the server SiteGround provisioned to us, the /usr/local/php56/bin/php-cli path is PHP 5.6 CLI.
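
If you want to double-check which flavour each binary is, run it with -v over SSH. The first line of the output shows the SAPI in parentheses, e.g. (cli) for the command-line build or (cgi-fcgi) for the CGI one:

/usr/local/bin/php -v
/usr/local/php56/bin/php-cli -v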

GJSchaller
Will do, thank you.

I'm running PHP 7 on my sites, and it's also my server's default version. Would that make a difference?

nicholas
Akeeba Staff
Manager
Not at all. I am developing Akeeba Backup on PHP 7.0 and testing it with PHP 5.4, 5.5, 5.6, 7.0 and 7.1 (Joomla! still doesn't run very well on 7.2, throwing notices which get in the way of testing).

GJSchaller
The first method failed again. I'm scheduling the alt method for later this afternoon and will report back when it runs.

GJSchaller
The alt version worked! Thank you!

Do you need anything else to troubleshoot, or are you OK with that as a solution?

nicholas
Akeeba Staff
Manager
OK, it's process limits as I suspected. There's nothing else for us to do.

The difference is that the alt script goes through the web server to run the backup. Instead of one big, resource-hungry process running the entire backup, the server now sees several small, much less demanding ones, so it won't kill them. The alt script itself consumes almost no resources. It just sits there, waiting for each backup step process to end before starting the next.

The only caveat with this solution concerns transferring files to remote storage. If you want to do that, you will need to use smaller part sizes (around 10M) or a transfer engine which supports transferring backup archives to remote storage in chunks (e.g. Amazon S3, Dropbox, OneDrive, Google Drive, ...). If you try to transfer a big file, say 50M, in one go (as FTP does), you will get a timeout from the web server that's running the backup step processes.

Armed with that knowledge I think you'll have no problem running your backups!

System Task
system
This ticket has been automatically closed. All tickets which have been inactive for a long time are automatically closed. If you believe that this ticket was closed in error, please contact us.

Support Information

Working hours: We are open Monday to Friday, 9am to 7pm Cyprus time zone (EET / EEST). Support is provided by the same developers writing the software, all of whom live in Europe. You can still file tickets outside of our working hours, but we cannot respond to them until we're back at the office.

Support policy: We would like to kindly inform you that when using our support you have already agreed to the Support Policy which is part of our Terms of Service. Thank you for your understanding and for helping us help you!