Support

Akeeba Backup for Joomla!

#31173 Upload to FTP/FTPS Failing

Posted in ‘Akeeba Backup for Joomla! 4 & 5’
This is a public ticket

Everybody will be able to see its contents. Do not include usernames, passwords or any other sensitive information.

Environment Information

Joomla! version
n/a
PHP version
n/a
Akeeba Backup version
n/a

Latest post on Friday, 26 July 2019 17:17 CDT

aashish108
Please look at the bottom of this page (under Support Policy Summary) for our support policy summary, containing important information regarding our working hours and our support policy. Thank you!

EXTREMELY IMPORTANT: Please attach a ZIP file containing your Akeeba Backup log file in order for us to help you with any backup or restoration issue. If the file is over 2MB, please upload it to your server and post a link to it.

Description of my issue:

Hi, I have tried to add a post-processing step that uploads to my NAS, but it fails every time. I used FTPS (not the cURL versions, as those have other errors). I am able to log in to my NAS via FTPS from another client and can upload fine. I use the same settings on the Akeeba side and it fails. I have checked the firewall logs on the server and it's not being blocked. I checked the error logs and found nothing of note. I checked my router logs: nothing. I checked my NAS logs: it's not being blocked there either, and the permissions are OK!

Attached are several logs from my various attempts. On the last attempt I tried plain FTP without encryption and it still failed.

Note that when setting up, I tested the connection and it all works, so all the details are correct. I also made sure the directory on the FTP server has 777 permissions.

Any help please?

Many thanks!

tampe125
Akeeba Staff
Hello,

I suspect something is blocking the outgoing connection from your site to your NAS. Can you please try toggling the Use passive mode option in your configuration?

Davide Tampellini

Developer and Support Staff

🇮🇹 Italian: native 🇬🇧 English: good • 🕐 My time zone is Europe / Rome (UTC +1)
Please keep in mind my timezone and cultural differences when reading my replies. Thank you!

aashish108
Hi, I checked the firewall logs after a trial run of the backup that uploads to my NAS. It doesn't seem to be blocked! I also checked the server logs and nothing appears there either :/ I tried passive mode both on and off, but it made no difference. Not sure how to proceed?

Thanks

tampe125
Akeeba Staff
The problem with FTP transfers is that FTP makes two connections: one for control and another for the actual data transfer.
That second connection usually creates issues: it can be made in passive or active mode, but there's no guarantee it can actually be completed.
Here you can find a detailed explanation of what's going on: https://stackoverflow.com/questions/1699145/what-is-the-difference-between-active-and-passive-ftp
If neither end of the connection allows opening the extra ports, it's impossible to start the transfer.
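For context, here is a minimal sketch of those two connections using PHP's own FTP functions (host, credentials and paths are placeholders, not your actual settings):

<?php
// Control connection: commands and replies travel here. Placeholder host.
$conn = ftp_connect('nas.example.com', 21);
ftp_login($conn, 'user', 'secret');

// The data connection is negotiated separately for every transfer.
// Passive mode makes the client open it towards a port the server
// announces, which is usually friendlier to NATs and firewalls.
ftp_pasv($conn, true);

if (!ftp_put($conn, '/backups/site.jpa', '/tmp/site.jpa', FTP_BINARY)) {
    // Login succeeded but the transfer failed: the data channel
    // is the part being blocked.
    echo "Data transfer failed -- try toggling passive mode\n";
}
ftp_close($conn);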

For what it's worth, does your NAS support the WebDAV protocol? It should be easier to set up, since it works over HTTP.
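To illustrate (placeholder URL and credentials, not a tested recipe for your NAS), a WebDAV upload is nothing more than an HTTP PUT, here via PHP's cURL extension:

<?php
// Placeholder file path, URL and credentials for illustration only.
$local = '/tmp/site.jpa';
$fh    = fopen($local, 'rb');
$ch    = curl_init('https://nas.example.com:5006/backups/site.jpa');
curl_setopt_array($ch, [
    CURLOPT_USERPWD        => 'user:secret',
    CURLOPT_UPLOAD         => true,        // issue an HTTP PUT
    CURLOPT_INFILE         => $fh,
    CURLOPT_INFILESIZE     => filesize($local),
    CURLOPT_RETURNTRANSFER => true,
]);
curl_exec($ch);
echo 'HTTP status: ', curl_getinfo($ch, CURLINFO_HTTP_CODE), "\n";
curl_close($ch);
fclose($fh);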

Davide Tampellini

Developer and Support Staff

🇮🇹 Italian: native 🇬🇧 English: good • 🕐 My time zone is Europe / Rome (UTC +1)
Please keep in mind my timezone and cultural differences when reading my replies. Thank you!

aashish108
Hi, thanks for the info, it was interesting. I checked my NAS and the backup files were there! It appears that when the CRON job runs automatically, it works, but when I manually trigger the backup profile, it fails to upload to my NAS via FTPS. Do you know what's going on here?

tampe125
Akeeba Staff
You're welcome!
FTP is one of those protocols designed back in the 70s and then retrofitted onto modern networks, so sometimes there are edge cases that are hard to spot.

Please remember that the PHP executable that powers your website is different from the one used by the CLI for CRON jobs (or at least it has a different configuration). I suspect a different policy allows the CLI to access your NAS server.
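A quick way to see the difference, if you're curious (a minimal diagnostic sketch, nothing Akeeba-specific): save this as a file, load it once through the browser and run it once as php file.php, then compare the output.

<?php
// Prints which PHP environment is executing and which php.ini it loaded.
echo 'SAPI:       ', PHP_SAPI, PHP_EOL;               // e.g. "fpm-fcgi" vs "cli"
echo 'Loaded ini: ', php_ini_loaded_file(), PHP_EOL;  // often differs between the two
echo 'OpenSSL:    ', extension_loaded('openssl') ? 'yes' : 'no', PHP_EOL;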

Davide Tampellini

Developer and Support Staff

🇮🇹 Italian: native 🇬🇧 English: good • 🕐 My time zone is Europe / Rome (UTC +1)
Please keep in mind my timezone and cultural differences when reading my replies. Thank you!

aashish108
Hi, I believe it works in FTP mode but not in FTPS mode. No logs anywhere show anything being blocked :/ neither on the server nor on my NAS. I'll try WebDAV.

Thanks.

tampe125
Akeeba Staff
Just for clarity's sake, the acronym FTP gets used for different things:
  • FTP: a plain, unencrypted connection using the FTP protocol
  • FTPS: an encrypted connection using the same FTP protocol; it's the same kind of difference as between HTTP and HTTPS
  • SFTP: a totally different protocol that transfers files over an SSH connection. SSH is the method you use to get CLI access to servers.


Are you sure you are using FTPS and not SFTP?
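For reference, a minimal sketch of how differently the two look in PHP (placeholder host and credentials; SFTP needs the optional ssh2 PECL extension):

<?php
// FTPS: the regular FTP protocol wrapped in TLS -- same commands, encrypted channel.
$ftps = ftp_ssl_connect('nas.example.com', 21);   // placeholder host
ftp_login($ftps, 'user', 'secret');

// SFTP: a completely different protocol, running inside an SSH session.
$ssh = ssh2_connect('nas.example.com', 22);
ssh2_auth_password($ssh, 'user', 'secret');
$sftp = ssh2_sftp($ssh);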

Davide Tampellini

Developer and Support Staff

🇮🇹 Italian: native 🇬🇧 English: good • 🕐 My time zone is Europe / Rome (UTC +1)
Please keep in mind my timezone and cultural differences when reading my replies. Thank you!

aashish108
Hi, I'm familiar with both, as I use SFTP for other projects. I used FTPS for the backup as it's easier. I couldn't get FTPS to work with the backup, nor SFTP :/ not even via CRON, though standard FTP worked. I ensured all ports were unblocked and checked the server/NAS logs, but nothing there indicates any blocks.

I also tried WebDAV but that does not work either! I did get it to work on my Mac, though, with the same settings I used for Akeeba Backup.

Any idea what is going on, please? Below is a small portion of the log:

DEBUG |190403 10:25:47|Remote relative WebDav URL: iskcon-london.org/
DEBUG |190403 10:25:47|Absolute WebDav URL: https://redacted.space:5006/Web%20Backups/iskcon-london.org/
DEBUG |190403 10:25:47|Received the following exception while checking if the remote folder exists: 0 - [CURL] Error while making request: Failed to connect to redacted.space port 5006: Connection refused (error code: 7)
DEBUG |190403 10:25:47|Error code different than 404, this means that a real error occurred
WARNING |190403 10:25:47|Failed to upload kickstart.php
WARNING |190403 10:25:47|Error received from the post-processing engine:
WARNING |190403 10:25:47|Failed to process file <root>administrator/components/com_akeeba/backup/site-www.iskcon-london.org-20190403-102401utc.jpa \n Post-processing interrupted -- no more files will be transferred \n Failed to upload kickstart.php
DEBUG |190403 10:25:47|Akeeba\Engine\Core\Domain\Finalization::_run() Running built-in method apply_quotas

Cheers

tampe125
Akeeba Staff
If FTP works and FTPS doesn't, it means that something is getting in the way.
This is confirmed by the WebDAV attempt, whose log shows the connection being refused outright.
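If you want to verify it from the web server itself, a simple reachability probe along these lines (host and port taken from your log, shown here as placeholders) should reproduce the same "Connection refused" that cURL reported as error 7:

<?php
$errno  = 0;
$errstr = '';
// Attempt a bare TCP connection to the WebDAV port, with a 10 second timeout.
$fp = @fsockopen('redacted.space', 5006, $errno, $errstr, 10);
if ($fp === false) {
    echo "Cannot connect: [$errno] $errstr\n";   // e.g. Connection refused
} else {
    echo "TCP connection succeeded\n";
    fclose($fp);
}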

Davide Tampellini

Developer and Support Staff

🇮🇹 Italian: native 🇬🇧 English: good • 🕐 My time zone is Europe / Rome (UTC +1)
Please keep in mind my timezone and cultural differences when reading my replies. Thank you!

tampe125
Akeeba Staff
I'd suggest you stick with plain FTP. If you want to be extra safe, you can use JPS archives: even if the connection is not encrypted, the contents of your backup won't be revealed.

Davide Tampellini

Developer and Support Staff

🇮🇹 Italian: native 🇬🇧 English: good • 🕐 My time zone is Europe / Rome (UTC +1)
Please keep in mind my timezone and cultural differences when reading my replies. Thank you!

aashish108
Hi, I can't use plain FTP, as that protocol is not allowed under our organisation's security guidelines and GDPR.

I am trying SFTP (SSH), and this is the error I am getting; it's rather non-descriptive. I am clicking the test connection button:


Could not connect to the remote SFTP server. The error message was:
Y

Do you know what this error means?

I filled in the private/public key fields using absolute paths on the server.

Cheers

tampe125
Akeeba Staff
> Hi, I can't use plain FTP, as that protocol is not allowed under our organisation's security guidelines and GDPR.
You can work around this by using JPS archives; in fact, you should use them anyway, since GDPR requires data to be encrypted at rest.
If you use encrypted archives, it doesn't matter how you transfer them, since they are already encrypted.
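To make the general idea concrete (this is not the JPS format, whose internals are Akeeba's own; just a generic "encrypt before transfer" sketch using PHP's OpenSSL extension, with illustrative paths):

<?php
// Encrypt the archive with AES-256-CBC before sending it over plain FTP;
// once the file itself is encrypted, the cleartext channel matters far less.
$key   = random_bytes(32);                    // keep this key safe and offline!
$iv    = random_bytes(16);
$plain = file_get_contents('/tmp/site.jpa');  // illustrative path
$enc   = openssl_encrypt($plain, 'aes-256-cbc', $key, OPENSSL_RAW_DATA, $iv);
file_put_contents('/tmp/site.jpa.enc', $iv . $enc);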

Regarding your test message, that's pretty strange. If you try to take an actual backup, does it work?
If you want me to debug further, I'll need to connect to your site, along with FTP (or equivalent) access. Is that something your company policy allows?

Davide Tampellini

Developer and Support Staff

🇮🇹 Italian: native 🇬🇧 English: good • 🕐 My time zone is Europe / Rome (UTC +1)
Please keep in mind my timezone and cultural differences when reading my replies. Thank you!

aashish108
Hi, I gave up on FTPS/SFTP/WebDAV, though unencrypted FTP with an encrypted backup file is a good idea.

However, I tried the Google Drive upload. It worked manually, so I saved the profile and made sure CRON would fire properly every day. Manually it works, but via CRON it uploads just a single file, whereas the manual process created lots of files of 100 MB each. Do you know what is going wrong, please?

Here is the log:

INFO |190407 23:00:59|Initializing post-processing engine
DEBUG |190407 23:00:59|132 files to process found
INFO |190407 23:00:59|Beginning post processing file <root>administrator/components/com_akeeba/backup/site-www.iskcon-london.org-20190407-230001utc.jpa
DEBUG |190407 23:00:59|Akeeba\Engine\Postproc\Googledrive::Akeeba\Engine\Postproc\Googledrive::initialiseConnector - Validating the Google Drive tokens
DEBUG |190407 23:00:59|Akeeba\Engine\Postproc\Googledrive::Akeeba\Engine\Postproc\Googledrive::initialiseConnector - Google Drive tokens were refreshed
DEBUG |190407 23:00:59|Akeeba\Engine\Postproc\Googledrive::Akeeba\Engine\Postproc\Googledrive::processPart - Preparing to upload to Google Drive, file path = Backups/iskcon-london.org/20190407/site-www.iskcon-london.org-20190407-230001utc.jpa.
DEBUG |190407 23:01:05|Akeeba\Engine\Postproc\Googledrive::Akeeba\Engine\Postproc\Googledrive::processPart - Google Drive folder ID = 17MoF14Wz1XRt7O0vQk1V5xonlPNTWmNn
DEBUG |190407 23:01:05|Akeeba\Engine\Postproc\Googledrive::Akeeba\Engine\Postproc\Googledrive::processPart - Using chunked upload, part size 104857600
DEBUG |190407 23:01:05|Akeeba\Engine\Postproc\Googledrive::Akeeba\Engine\Postproc\Googledrive::processPart - Trying to create possibly missing directories and remove existing file by the same name (Backups/iskcon-london.org/20190407/site-www.iskcon-london.org-20190407-230001utc.jpa)
DEBUG |190407 23:01:06|Akeeba\Engine\Postproc\Googledrive::Akeeba\Engine\Postproc\Googledrive::processPart - Creating new upload session
DEBUG |190407 23:01:06|Akeeba\Engine\Postproc\Googledrive::Akeeba\Engine\Postproc\Googledrive::processPart - New upload session https://www.googleapis.com/upload/drive/v3/files?supportsTeamDrives=true&uploadType=resumable&upload_id=AEnB2UoztN3CSAG4pTu_w6KE-QjJc7IxWHrA0SmUlFhvuvdh8THEFTd7OQB6nwO_ovc8EZ1Q8Oz017cwduyBq2G5Y7No7cyYHQ
DEBUG |190407 23:01:06|Akeeba\Engine\Postproc\Googledrive::Akeeba\Engine\Postproc\Googledrive::processPart - Uploading chunked part (offset:0 // chunk size: 104857600)
DEBUG |190407 23:01:08|Akeeba\Engine\Postproc\Googledrive::Akeeba\Engine\Postproc\Googledrive::processPart - Got uploadPart result Array \n ( \n [kind] => drive#file \n [id] => 15pr0NhDiJ3-2szdjYo3GynbrqGsaNcnW \n [name] => site-www.iskcon-london.org-20190407-230001utc.jpa \n [mimeType] => application/octet-stream \n ) \n
DEBUG |190407 23:01:08|Akeeba\Engine\Postproc\Googledrive::Akeeba\Engine\Postproc\Googledrive::processPart - Chunked upload is now complete
INFO |190407 23:01:08|Finished post-processing file <root>administrator/components/com_akeeba/backup/site-www.iskcon-london.org-20190407-230001utc.jpa
DEBUG |190407 23:01:08|Deleting already processed file <root>administrator/components/com_akeeba/backup/site-www.iskcon-london.org-20190407-230001utc.jpa
DEBUG |190407 23:01:10|Akeeba\Engine\Core\Domain\Finalization::_run() Running built-in method run_post_processing
DEBUG |190407 23:01:10|Loading post-processing engine object (googledrive)
INFO |190407 23:01:10|Beginning post processing file <root>administrator/components/com_akeeba/backup/site-www.iskcon-london.org-20190407-230001utc.j01
DEBUG |190407 23:01:10|Akeeba\Engine\Postproc\Googledrive::Akeeba\Engine\Postproc\Googledrive::initialiseConnector - Validating the Google Drive tokens
DEBUG |190407 23:01:10|Akeeba\Engine\Postproc\Googledrive::Akeeba\Engine\Postproc\Googledrive::processPart - Using chunked upload, part size 104857600
DEBUG |190407 23:01:10|Akeeba\Engine\Postproc\Googledrive::Akeeba\Engine\Postproc\Googledrive::processPart - Trying to create possibly missing directories and remove existing file by the same name (Backups/iskcon-london.org/20190407/site-www.iskcon-london.org-20190407-230001utc.j01)
DEBUG |190407 23:01:11|Akeeba\Engine\Postproc\Googledrive::Akeeba\Engine\Postproc\Googledrive::processPart - Creating new upload session
DEBUG |190407 23:01:12|Akeeba\Engine\Postproc\Googledrive::Akeeba\Engine\Postproc\Googledrive::processPart - New upload session https://www.googleapis.com/upload/drive/v3/files?supportsTeamDrives=true&uploadType=resumable&upload_id=AEnB2UoEsr5crdPSY9EK0t1TJdxPvCoz6ey1pHcGykn8MBSuEoo-ShDJzLWol5yPScrZUGFhBQg0CSgZG0nXggdJJCcUy48QCQ
DEBUG |190407 23:01:12|Akeeba\Engine\Postproc\Googledrive::Akeeba\Engine\Postproc\Googledrive::processPart - Uploading chunked part (offset:0 // chunk size: 104857600)
--- END OF RAW LOG ---

tampe125
Akeeba Staff
Can you please attach the full log? A small excerpt of a few lines usually isn't enough; I need the full context.

Davide Tampellini

Developer and Support Staff

🇮🇹 Italian: native 🇬🇧 English: good • 🕐 My time zone is Europe / Rome (UTC +1)
Please keep in mind my timezone and cultural differences when reading my replies. Thank you!

aashish108
Attached.

tampe125
Akeeba Staff
Nope :)
Can you please upload it somewhere and post the link to it here?

Davide Tampellini

Developer and Support Staff

🇮🇹 Italian: native 🇬🇧 English: good • 🕐 My time zone is Europe / Rome (UTC +1)
Please keep in mind my timezone and cultural differences when reading my replies. Thank you!

aashish108
Here is the file: https://send.firefox.com/download/8a605b98a8b3a106/#7XgAUKnEsc0yn0vjCm9hug

tampe125
Akeeba Staff
Looking at your backup log, the process simply dies after about 70-75 seconds. I suspect there is a hard timeout for CLI scripts imposed by your host. Can you please double-check that?
However, the same profile should run fine if you launch it from the backend; can you please try that?
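If it helps, a crude probe along these lines (assuming you can run scripts over SSH; the file name is illustrative) can confirm an external kill: if the output stops near the same mark no matter what the PHP settings say, the host is terminating the process from outside PHP.

<?php
// Run as: php probe.php -- then note the last line printed before it dies.
set_time_limit(0);            // in CLI mode PHP itself imposes no time limit
$start = time();
while (true) {
    echo 'alive after ', time() - $start, "s\n";
    sleep(5);
}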

Davide Tampellini

Developer and Support Staff

🇮🇹 Italian: native 🇬🇧 English: good • 🕐 My time zone is Europe / Rome (UTC +1)
Please keep in mind my timezone and cultural differences when reading my replies. Thank you!

aashish108
Hi, I finally adjusted these PHP parameters:

max_execution_time is 200
max_input_time is 200
session.gc_maxlifetime is 1440

Instead of uploading one 3 MB file, it can now upload two files of 5 MB each! But don't we want to avoid a script execution time that is too long? I already tried running it manually and it works that way, but I want an automated approach. Any recommendations, please?

Here is the log if you need it: https://send.firefox.com/download/adf51cd47ecf31f1/#y7RUKmxAMFV8Kel6DbHSkg

Thanks

tampe125
Akeeba Staff
Again, the backup process dies after 65 seconds.
When the same profile works from the backend but dies under the CLI, it means the server is killing the process.
You should get in touch with your host/sysadmin and report the issue.

Davide Tampellini

Developer and Support Staff

🇮🇹 Italian: native 🇬🇧 English: good • 🕐 My time zone is Europe / Rome (UTC +1)
Please keep in mind my timezone and cultural differences when reading my replies. Thank you!

aashish108
Hi, thanks for the help - it's all working now. The actual issue was that it was running out of memory, even though the manual way didn't.

tampe125
Akeeba Staff
That makes sense: when run manually, the full backup is performed across several page refreshes, so memory is wiped clean on each one.
In the CLI environment everything happens in a single process, so in some rare cases you can run into memory issues.
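If you ever need to confirm a memory squeeze like that, a few lines like these (a generic sketch, not part of Akeeba Backup) dropped into any long-running CLI script show how close you are to the ceiling:

<?php
// Compare the configured limit against actual and peak usage.
printf("memory_limit: %s\n", ini_get('memory_limit'));
printf("current:      %.1f MB\n", memory_get_usage(true) / 1048576);
printf("peak:         %.1f MB\n", memory_get_peak_usage(true) / 1048576);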

I'm glad you fixed the issue!

Davide Tampellini

Developer and Support Staff

🇮🇹 Italian: native 🇬🇧 English: good • 🕐 My time zone is Europe / Rome (UTC +1)
Please keep in mind my timezone and cultural differences when reading my replies. Thank you!

System Task
system
This ticket has been automatically closed. All tickets which have been inactive for a long time are automatically closed. If you believe that this ticket was closed in error, please contact us.


Support Information

Working hours: We are open Monday to Friday, 9am to 7pm Cyprus time zone (EET / EEST). Support is provided by the same developers who write the software, all of whom live in Europe. You can still file tickets outside of our working hours, but we cannot respond to them until we're back at the office.

Support policy: We would like to kindly inform you that when using our support you have already agreed to the Support Policy which is part of our Terms of Service. Thank you for your understanding and for helping us help you!