Support

Akeeba Backup for Joomla!

#31328 Best cloud supplier

Posted in ‘Akeeba Backup for Joomla! 4 & 5’
This is a public ticket

Everybody will be able to see its contents. Do not include usernames, passwords or any other sensitive information.

Environment Information

Joomla! version
n/a
PHP version
n/a
Akeeba Backup version
n/a

Latest post by dlb on Wednesday, 08 May 2019 13:52 CDT

pigos
Please look at the bottom of this page (under Support Policy Summary) for our support policy summary, containing important information regarding our working hours and our support policy. Thank you!

EXTREMELY IMPORTANT: Please attach a ZIP file containing your Akeeba Backup log file in order for us to help you with any backup or restoration issue. If the file is over 2MB, please upload it on your server and post a link to it.

Description of my issue:
I have several school sites based on Joomla! that need to be backed up. The average backup size is 1 to 2 GB each.
I have tried several transfer targets: Google Drive, OneDrive, and a Synology NAS over FTP.
In every case I got errors with big files, even when chunking them, especially when using SFTP. I understand the suggestion to split into very small pieces (10 to 50 MB), but that generates a lot of files for each backup, and I often got errors in that case too.

Is there a cloud platform that you can suggest as more suitable to solve that problem?

If I generate the archive on the server, use FileZilla to transfer it to my local PC, and then upload it from the PC to the cloud, it works, but it requires a lot of manual effort.

Any suggestion?

Many thanks and regards

dlb
I usually use Amazon S3; it is one of the best, and it's the devil I know.

There are two terms used for file size, "part" and "chunk". The part size is set in the Archiver Engine configuration and physically splits the archive into multiple part files; all of the parts must be present to extract the archive. The chunk size is set in the Post-Processing Engine configuration: it breaks the archive into small pieces during the transfer, uploads each piece, and lets the provider reassemble them into a single archive file on the cloud side. So you start with a single part and you end up with a single part; the chunks happen behind the scenes during the transfer, as in the sketch below. Not all cloud storage providers offer chunked uploads.
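To make the distinction concrete, here is a minimal sketch of the "chunk" idea in Python. All of the names are hypothetical illustrations, not Akeeba's actual code (which is PHP): the point is that the archive on disk stays one file, and only the transfer is sliced.

```python
CHUNK_SIZE = 10 * 1024 * 1024  # 10 MB per transfer slice (assumed value)

def upload_in_chunks(path, send_chunk):
    """Upload a single archive file slice by slice.

    `send_chunk(index, data)` stands in for whatever chunked-upload call
    the storage provider exposes; the provider reassembles the slices
    into one object on its side. The file itself is never split.
    """
    with open(path, "rb") as f:
        index = 0
        while True:
            data = f.read(CHUNK_SIZE)
            if not data:  # end of file reached
                break
            send_chunk(index, data)
            index += 1
```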

Your archive is normally stored in the Output folder during the backup, then is deleted after it is transferred to cloud storage.

Using FTP/SFTP for post-processing is difficult. FTP and PHP really don't like one another: the FTP transfer has to upload the whole part within the PHP time limit or you get a timeout and nothing is transferred, and an FTP upload can't be resumed from within PHP. So the usual problem with FTP/SFTP uploads is a part size that is too big.
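To put rough numbers on that: the whole part has to finish uploading within one PHP run, so the usable part size is bounded by your upload speed times PHP's time limit. A back-of-the-envelope helper, with made-up example numbers:

```python
def max_safe_part_size_mb(upload_speed_mbps, php_time_limit_s, safety=0.75):
    """Upper bound on the part size so one part fits in one PHP run.

    upload_speed_mbps: sustained upload speed in megabits per second.
    php_time_limit_s:  PHP's max_execution_time in seconds.
    safety:            head-room factor for connection setup and overhead.
    """
    megabytes_per_second = upload_speed_mbps / 8
    return megabytes_per_second * php_time_limit_s * safety

# Example: a 10 Mbps uplink and a 30 s PHP limit allow roughly
# 10/8 * 30 * 0.75, about 28 MB per part, hence the very small part
# sizes usually recommended for FTP/SFTP post-processing.
print(max_safe_part_size_mb(10, 30))  # ~28.1
```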

One advantage of S3 (and Dropbox, which uses S3 for storage) is that they support chunked uploads.
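For reference, a chunked upload to S3 looks like this at the API level (S3's own documentation calls the chunks "parts", which here means Akeeba's chunks, not archive parts). This is a minimal Python sketch using boto3's multipart-upload calls; the bucket and file names are made up, and Akeeba of course implements the same protocol in PHP rather than through boto3:

```python
import boto3

s3 = boto3.client("s3")
bucket, key = "my-backup-bucket", "site-backup.jpa"  # hypothetical names
CHUNK = 10 * 1024 * 1024  # S3 requires >= 5 MB for every chunk but the last

# 1. Open a multipart upload and get a ticket (UploadId) for it.
upload = s3.create_multipart_upload(Bucket=bucket, Key=key)

parts = []
with open("site-backup.jpa", "rb") as f:
    number = 1
    while True:
        data = f.read(CHUNK)
        if not data:
            break
        # 2. Upload one chunk; S3 keeps it under this UploadId.
        resp = s3.upload_part(Bucket=bucket, Key=key, PartNumber=number,
                              UploadId=upload["UploadId"], Body=data)
        parts.append({"ETag": resp["ETag"], "PartNumber": number})
        number += 1

# 3. Ask S3 to stitch the chunks into one object: the single archive file.
s3.complete_multipart_upload(Bucket=bucket, Key=key,
                             UploadId=upload["UploadId"],
                             MultipartUpload={"Parts": parts})
```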

Generally speaking, we can get the archive to upload to most cloud storage providers. I would need to see the backup log to find out what is happening with the upload and know what adjustments to make for it to succeed.


Dale L. Brackin
Support Specialist


English: native


Please keep in mind my timezone and cultural differences when reading my replies. Thank you!


My time zone is EST (UTC -5).

Support Information

Working hours: We are open Monday to Friday, 9am to 7pm Cyprus timezone (EET / EEST). Support is provided by the same developers writing the software, all of which live in Europe. You can still file tickets outside of our working hours, but we cannot respond to them until we're back at the office.

Support policy: We would like to kindly inform you that when using our support you have already agreed to the Support Policy which is part of our Terms of Service. Thank you for your understanding and for helping us help you!