
Error: Timeout (data socket) when downloading to a stream #256

Open
williamsdyyz opened this issue Aug 21, 2024 · 5 comments

Comments

williamsdyyz commented Aug 21, 2024

I'm downloading a zip file to a stream. The stream pipeline then unzips and processes the data, which takes 2-3 minutes.

What I'm seeing is a successful connection, and the zip file downloads quickly. Then, while the rest of the pipeline is still processing, the error occurs and the end of the stream appears to be corrupted.

This happens with the 3rd-party FTP server I'm trying to access and also with my local FileZilla test server.

Example code

This code fails with the socket error and the stream data is corrupted. The exact same code with a local file stream works.

client.downloadTo(unzipStream, "FILE.zip");

const pipeline = chain([
  unzipStream,
  csvParser(),
  ...
  ...
])

pipeline.on("data", (data) => {
  // Process the data
});
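
For reference, a minimal self-contained version of this pattern might look like the sketch below. The package choices (unzipper, stream-chain, csv-parser) and the connection details are assumptions pieced together from later comments in this thread, not confirmed code.

const { Client } = require("basic-ftp");
const { ParseOne } = require("unzipper");
const { chain } = require("stream-chain");
const csvParser = require("csv-parser");

async function run() {
  const client = new Client();
  await client.access({ host: "example.com", user: "user", password: "pass" });

  // ParseOne extracts the first zip entry whose name matches the regex.
  const unzipStream = ParseOne(/\.csv$/);
  const pipeline = chain([unzipStream, csvParser()]);

  pipeline.on("data", (row) => {
    // Process each parsed CSV row here.
  });

  // Resolve when the whole pipeline has finished, not just the download.
  const done = new Promise((resolve, reject) => {
    pipeline.on("end", resolve);
    pipeline.on("error", reject);
  });

  await client.downloadTo(unzipStream, "FILE.zip");
  await done;
  client.close();
}

run().catch(console.error);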

Console output

Connected to xx.xx.xx.200:21 (No encryption)
< 220 Microsoft FTP Service

> OPTS UTF8 ON
< 200 OPTS UTF8 command successful - UTF8 encoding now ON.

Login security: No encryption
> USER xxx
< 331 Password required

> PASS ###
< 230 User logged in.

> FEAT
< 211-Extended features supported:
 LANG EN*
 UTF8
 AUTH TLS;TLS-C;SSL;TLS-P;
 PBSZ
 PROT C;P;
 CCC
 HOST
 SIZE
 MDTM
 REST STREAM
211 END

> TYPE I
< 200 Type set to I.

> STRU F
< 200 STRU F ok.

> OPTS UTF8 ON
< 200 OPTS UTF8 command successful - UTF8 encoding now ON.

Trying to find optimal transfer strategy...
> EPSV
< 229 Entering Extended Passive Mode (|||50060|)

Optimal transfer strategy found.
> RETR FILE.zip
< 125 Data connection already open; Transfer starting.

Downloading from xx.xx.xx.200:50060 (No encryption)
< 226 Transfer complete.

> QUIT
Error: Timeout (data socket)
    at Socket.<anonymous> (/home/x/node_modules/basic-ftp/dist/FtpContext.js:319:33)
    at Object.onceWrapper (node:events:628:28)
    at Socket.emit (node:events:514:28)
    at Socket.emit (node:domain:552:15)
    at Socket._onTimeout (node:net:589:8)
    at listOnTimeout (node:internal/timers:573:17)
    at process.processTimers (node:internal/timers:514:7)

Which version of Node.js are you using?
20.5.1

Which version of basic-ftp are you using?
5.0.5

Additional context
The FTP server is not under my control. All I know is that it runs "Microsoft FTP Service".
My project is based on @google-cloud/functions-framework and is deployed as a cloud function.
I'm using the stream-chain package to simplify stream handling. I tried a regular Node pipeline with the same result.

Additional info after further investigation
The issue seems to happen when basic-ftp is used with unzipper. When I switched out unzipper for zlib (and changed to a .gz file), everything worked as expected.

I'm using unzipper like this:

const unzipStream = ParseOne(/^FILE.zip$/)

It's unclear whether this is a basic-ftp issue or not. All I can say is that I've used unzipper successfully with ssh2-sftp-client.
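
For comparison, the zlib variant described above might look roughly like this. Node's built-in zlib.createGunzip() stands in for unzipper's ParseOne; the file name and connection details are placeholders.

const zlib = require("zlib");
const { Client } = require("basic-ftp");
const { chain } = require("stream-chain");
const csvParser = require("csv-parser");

async function runGz() {
  const client = new Client();
  await client.access({ host: "example.com", user: "user", password: "pass" });

  // Same pipeline shape as before, with zlib replacing unzipper.
  const gunzip = zlib.createGunzip();
  const pipeline = chain([gunzip, csvParser()]);
  pipeline.on("data", (row) => {
    // Process each parsed CSV row here.
  });

  await client.downloadTo(gunzip, "FILE.csv.gz");
  client.close();
}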

patrickjuchli (Owner) commented

Do you end/finish your target stream when it's done?

jimmysafe commented Sep 1, 2024

@patrickjuchli I'm having the same issue. It was working a few weeks ago; now the request times out.
Here is my code:

  const client = new Client();
  client.ftp.verbose = false;
  await client.access({
    host: "MY-HOST-IP",
    user: "MY-HOST-USER",
    password: "MY-HOST-PASSWORD",
  });

  const stream = new PassThrough();
  // THIS IS WHERE IT HANGS FOREVER
  await client.downloadTo(stream, "compatta.xml");

  let data = "";
  const streamPromise = new Promise((resolve, reject) => {
    stream.on("data", (chunk) => {
      data += chunk;
    });

    stream.on("end", () => {
      const result = convert.xml2json(data, {
        compact: true,
        spaces: 2,
        ignoreDeclaration: true,
      });
      resolve(result);
    });

    stream.on("error", (error) => {
      reject(error);
    });
  });
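
One plausible cause, offered as an assumption rather than a confirmed diagnosis: downloadTo is awaited before any consumer is attached to the PassThrough, so once the stream's internal buffer (16 KiB by default) fills up, backpressure pauses the data connection and the transfer never completes. Setting up the consumer before awaiting the download avoids that, e.g.:

const { Client } = require("basic-ftp");
const { PassThrough } = require("stream");
const convert = require("xml-js");

async function fetchXml() {
  const client = new Client();
  await client.access({
    host: "MY-HOST-IP",
    user: "MY-HOST-USER",
    password: "MY-HOST-PASSWORD",
  });

  const stream = new PassThrough();

  // Attach the handlers BEFORE awaiting downloadTo, so the PassThrough
  // is drained while data arrives instead of filling up and stalling.
  let data = "";
  const streamPromise = new Promise((resolve, reject) => {
    stream.on("data", (chunk) => (data += chunk));
    stream.on("end", () =>
      resolve(
        convert.xml2json(data, {
          compact: true,
          spaces: 2,
          ignoreDeclaration: true,
        })
      )
    );
    stream.on("error", reject);
  });

  await client.downloadTo(stream, "compatta.xml");
  const result = await streamPromise;
  client.close();
  return result;
}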

williamsdyyz (Author) commented

Do you end/finish your target stream when it's done?

You mean the last stream in the pipeline? No. Everything in the pipeline is a 3rd-party NPM package. I just consume the result with on("data"...

patrickjuchli (Owner) commented

@jimmysafe, there hasn't been a new release in recent weeks or months. Do you know which version last worked on your end?

emilguareno commented
I'm getting the same issue here with slightly larger files, around 15 MB to 20 MB, but for me it happens during upload.
