[GH-ISSUE #18] Script process is running for hours but never calls OpenAI API #16

Closed
opened 2026-03-03 13:52:13 +03:00 by kerem · 11 comments
Owner

Originally created by @JungleGenius on GitHub (Jun 26, 2024).
Original GitHub issue: https://github.com/jehna/humanify/issues/18

Hi Jehna,

Thank you for sharing this project.

I'm running this on a Debian Bookworm VM on my Intel Mac.

The script starts and creates a folder named after the output file, with deobfuscated.js in it. It also creates several temp files in /tmp/tsx-0. In the process list call stack, esbuild is the topmost process being run. The top command shows it using CPU/memory, and the logs show the following:

0 verbose cli /usr/bin/node /usr/bin/npm
1 info using npm@9.2.0
2 info using node@v18.13.0
3 timing npm:load:whichnode Completed in 1ms
4 timing config:load:defaults Completed in 2ms
5 timing config:load:file:/usr/share/nodejs/npm/npmrc Completed in 4ms
6 timing config:load:builtin Completed in 4ms
7 timing config:load:cli Completed in 1ms
8 timing config:load:env Completed in 0ms
9 timing config:load:file:/root/decode/humanify/.npmrc Completed in 1ms
10 timing config:load:project Completed in 3ms
11 timing config:load:file:/root/.npmrc Completed in 0ms
12 timing config:load:user Completed in 0ms
13 timing config:load:file:/etc/npmrc Completed in 0ms
14 timing config:load:global Completed in 0ms
15 timing config:load:setEnvs Completed in 1ms
16 timing config:load Completed in 12ms
17 timing npm:load:configload Completed in 12ms
18 timing npm:load:mkdirpcache Completed in 1ms
19 timing npm:load:mkdirplogs Completed in 0ms
20 verbose title npm start --key=sk-redacted --4k -o x.js x.js.org
21 verbose argv "start" "--" "--key=sk-redacted" "--4k" "-o" "x.js" "x.js.org"
22 timing npm:load:setTitle Completed in 1ms
23 timing config:load:flatten Completed in 3ms
24 timing npm:load:display Completed in 4ms
25 verbose logfile logs-max:10 dir:/root/.npm/_logs/2024-06-26T15_16_38_973Z-
26 verbose logfile /root/.npm/_logs/2024-06-26T15_16_38_973Z-debug-0.log
27 timing npm:load:logFile Completed in 5ms
28 timing npm:load:timers Completed in 0ms
29 timing npm:load:configScope Completed in 0ms
30 timing npm:load Completed in 24ms
31 silly logfile start cleaning logs, removing 1 files
32 timing config:load:flatten Completed in 0ms
33 silly logfile done cleaning log files

However, when I look at the network traffic from the VM, I don't see any connections to OpenAI's API endpoint.

Is it normal for it to run this long (several hours) with no errors being thrown and no API calls?

I'm not a regular Node user (I know JavaScript, but not Node), so I'm not sure how to debug this or whether something is amiss.

Is it stuck in a loop?
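(For anyone debugging the same symptom, a hedged sketch of commands that can show whether the stuck process is making network calls at all. The `npm start` match pattern is an assumption, and pgrep/ss/strace are assumed to be installed; adjust to whatever your process list actually shows.)

```shell
# Hedged sketch: check what a long-running node process is actually doing.
# "npm start" is an assumed pattern -- match your own process list.
PID=$(pgrep -f "npm start" | head -n1)

if [ -z "$PID" ]; then
  echo "no matching process found"
else
  # Any open TCP connections? Look for api.openai.com / port 443 here.
  ss -tnp | grep "pid=$PID," || echo "no TCP connections for PID $PID"
  # To see which syscall it is blocked in (requires strace):
  # strace -p "$PID" -f -e trace=network
fi
```

If the process shows no TCP connections and is blocked in a non-network syscall, the hang is local (e.g. parsing/CPU work) rather than a stalled API call.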

Thanks

-Daniel

kerem closed this issue 2026-03-03 13:52:14 +03:00

@jehna commented on GitHub (Jun 26, 2024):

Which flags are you using to run humanify?


@jehna commented on GitHub (Jun 26, 2024):

Oh, right the verbose output has argv output too. Let me check if that still works on my machine


@JungleGenius commented on GitHub (Jun 26, 2024):

npm start --key=sk-redacted --4k -o x.js x.js.org

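(The error dump later in the thread shows the request humanify's OpenAI client makes: a POST to /v1/chat/completions with a Bearer token. As a hedged sketch, not humanify's actual code, that request can be rebuilt by hand to verify the key and connectivity independently; `buildRequest` and the model name are illustrative assumptions.)

```javascript
// Hedged sketch (not humanify's code): build the same kind of request the
// error dump shows, so the key and connectivity can be tested by hand.
function buildRequest(key) {
  return {
    url: "https://api.openai.com/v1/chat/completions",
    options: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${key}`, // same header as in the dump
      },
      body: JSON.stringify({
        model: "gpt-3.5-turbo", // illustrative; humanify may use a different model
        messages: [{ role: "user", content: "ping" }],
      }),
    },
  };
}

// Usage (Node 18+ has a global fetch):
// const { url, options } = buildRequest(process.env.OPENAI_API_KEY);
// fetch(url, options).then(r => console.log(r.status)); // 200 means key + network are fine
```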

@JungleGenius commented on GitHub (Jun 26, 2024):

@jehna Looks like it was grinding away with 65,000+ tokens. However, it died.

      url: 'https://api.openai.com/v1/chat/completions'
    },
    request: <ref *2> ClientRequest {
      _events: [Object: null prototype] {
        abort: [Function (anonymous)],
        aborted: [Function (anonymous)],
        connect: [Function (anonymous)],
        error: [Function (anonymous)],
        socket: [Function (anonymous)],
        timeout: [Function (anonymous)],
        finish: [Function: requestOnFinish]
      },
      _eventsCount: 7,
      _maxListeners: undefined,
      outputData: [],
      outputSize: 0,
      writable: true,
      destroyed: false,
      _last: true,
      chunkedEncoding: false,
      shouldKeepAlive: false,
      maxRequestsOnConnectionReached: false,
      _defaultKeepAlive: true,
      useChunkedEncodingByDefault: true,
      sendDate: false,
      _removedConnection: false,
      _removedContLen: false,
      _removedTE: false,
      strictContentLength: false,
      _contentLength: 2628,
      _hasBody: true,
      _trailer: '',
      finished: true,
      _headerSent: true,
      _closed: false,
      socket: <ref *1> TLSSocket {
        _tlsOptions: {
          allowHalfOpen: undefined,
          pipe: false,
          secureContext: SecureContext { context: SecureContext {} },
          isServer: false,
          requestCert: true,
          rejectUnauthorized: true,
          session: Buffer(1737) [Uint8Array] [
             48, 130,   6, 197,   2,   1,   1,   2,   2,   3,   4,   4,
              2,  19,   2,   4,  32,  70, 145,  89,  53, 171, 164,  45,
             32, 245, 145,  80, 250,  97,  94, 143,  83,  40,  17, 148,
             98, 155, 251, 177, 114, 211, 136, 136,  82,  40,  98, 220,
             92,   4,  48, 158, 151,  61, 236,   1, 160,  77, 213, 156,
            112, 149,  89, 208, 110, 114,  82, 102,  25,  38,  49, 191,
            131,  30, 167, 147,  13,   4, 112,  95,  72,   6, 180, 117,
            193, 174, 160, 144, 218, 242,  62,  77,  50, 108,  72,  58,
             12,  48,  41, 161,
            ... 1637 more items
          ],
          ALPNProtocols: undefined,
          requestOCSP: undefined,
          enableTrace: undefined,
          pskCallback: undefined,
          highWaterMark: undefined,
          onread: undefined,
          signal: undefined
        },
        _secureEstablished: true,
        _securePending: false,
        _newSessionPending: false,
        _controlReleased: true,
        secureConnecting: false,
        _SNICallback: null,
        servername: 'api.openai.com',
        alpnProtocol: false,
        authorized: true,
        authorizationError: null,
        encrypted: true,
        _events: [Object: null prototype] {
          close: [
            [Function: onSocketCloseDestroySSL],
            [Function],
            [Function: onClose],
            [Function: socketCloseListener]
          ],
          end: [Function: onReadableStreamEnd],
          newListener: [Function: keylogNewListener],
          secure: [Function: onConnectSecure],
          session: [Function (anonymous)],
          free: [Function: onFree],
          timeout: [Function: onTimeout],
          agentRemove: [Function: onRemove],
          error: [Function: socketErrorListener],
          finish: [Function: bound onceWrapper] {
            listener: [Function: destroy]
          }
        },
        _eventsCount: 10,
        connecting: false,
        _hadError: false,
        _parent: null,
        _host: 'api.openai.com',
        _closeAfterHandlingError: false,
        _readableState: ReadableState {
          objectMode: false,
          highWaterMark: 16384,
          buffer: BufferList { head: null, tail: null, length: 0 },
          length: 0,
          pipes: [],
          flowing: true,
          ended: false,
          endEmitted: false,
          reading: true,
          constructed: true,
          sync: false,
          needReadable: true,
          emittedReadable: false,
          readableListening: false,
          resumeScheduled: false,
          errorEmitted: false,
          emitClose: false,
          autoDestroy: true,
          destroyed: false,
          errored: null,
          closed: false,
          closeEmitted: false,
          defaultEncoding: 'utf8',
          awaitDrainWriters: null,
          multiAwaitDrain: false,
          readingMore: false,
          dataEmitted: true,
          decoder: null,
          encoding: null,
          [Symbol(kPaused)]: false
        },
        _maxListeners: undefined,
        _writableState: WritableState {
          objectMode: false,
          highWaterMark: 16384,
          finalCalled: true,
          needDrain: false,
          ending: true,
          ended: true,
          finished: false,
          destroyed: false,
          decodeStrings: false,
          defaultEncoding: 'utf8',
          length: 0,
          writing: false,
          corked: 0,
          sync: false,
          bufferProcessing: false,
          onwrite: [Function: bound onwrite],
          writecb: null,
          writelen: 0,
          afterWriteTickInfo: null,
          buffered: [],
          bufferedIndex: 0,
          allBuffers: true,
          allNoop: true,
          pendingcb: 1,
          constructed: true,
          prefinished: false,
          errorEmitted: false,
          emitClose: false,
          autoDestroy: true,
          errored: null,
          closed: false,
          closeEmitted: false,
          [Symbol(kOnFinished)]: []
        },
        allowHalfOpen: false,
        _sockname: null,
        _pendingData: null,
        _pendingEncoding: '',
        server: undefined,
        _server: null,
        ssl: TLSWrap {
          _parent: TCP {
            reading: [Getter/Setter],
            onconnection: null,
            [Symbol(owner_symbol)]: [Circular *1]
          },
          _parentWrap: undefined,
          _secureContext: SecureContext { context: SecureContext {} },
          reading: true,
          onkeylog: [Function: onkeylog],
          onhandshakestart: {},
          onhandshakedone: [Function (anonymous)],
          onocspresponse: [Function: onocspresponse],
          onnewsession: [Function: onnewsessionclient],
          onerror: [Function: onerror],
          [Symbol(owner_symbol)]: [Circular *1]
        },
        _requestCert: true,
        _rejectUnauthorized: true,
        parser: null,
        _httpMessage: [Circular *2],
        [Symbol(res)]: TLSWrap {
          _parent: TCP {
            reading: [Getter/Setter],
            onconnection: null,
            [Symbol(owner_symbol)]: [Circular *1]
          },
          _parentWrap: undefined,
          _secureContext: SecureContext { context: SecureContext {} },
          reading: true,
          onkeylog: [Function: onkeylog],
          onhandshakestart: {},
          onhandshakedone: [Function (anonymous)],
          onocspresponse: [Function: onocspresponse],
          onnewsession: [Function: onnewsessionclient],
          onerror: [Function: onerror],
          [Symbol(owner_symbol)]: [Circular *1]
        },
        [Symbol(verified)]: true,
        [Symbol(pendingSession)]: null,
        [Symbol(async_id_symbol)]: 4879,
        [Symbol(kHandle)]: TLSWrap {
          _parent: TCP {
            reading: [Getter/Setter],
            onconnection: null,
            [Symbol(owner_symbol)]: [Circular *1]
          },
          _parentWrap: undefined,
          _secureContext: SecureContext { context: SecureContext {} },
          reading: true,
          onkeylog: [Function: onkeylog],
          onhandshakestart: {},
          onhandshakedone: [Function (anonymous)],
          onocspresponse: [Function: onocspresponse],
          onnewsession: [Function: onnewsessionclient],
          onerror: [Function: onerror],
          [Symbol(owner_symbol)]: [Circular *1]
        },
        [Symbol(lastWriteQueueSize)]: 0,
        [Symbol(timeout)]: null,
        [Symbol(kBuffer)]: null,
        [Symbol(kBufferCb)]: null,
        [Symbol(kBufferGen)]: null,
        [Symbol(kCapture)]: false,
        [Symbol(kSetNoDelay)]: false,
        [Symbol(kSetKeepAlive)]: true,
        [Symbol(kSetKeepAliveInitialDelay)]: 60,
        [Symbol(kBytesRead)]: 0,
        [Symbol(kBytesWritten)]: 0,
        [Symbol(connect-options)]: {
          rejectUnauthorized: true,
          ciphers: 'TLS_AES_256_GCM_SHA384:TLS_CHACHA20_POLY1305_SHA256:TLS_AES_128_GCM_SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-AES256-GCM-SHA384:DHE-RSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-SHA256:DHE-RSA-AES128-SHA256:ECDHE-RSA-AES256-SHA384:DHE-RSA-AES256-SHA384:ECDHE-RSA-AES256-SHA256:DHE-RSA-AES256-SHA256:HIGH:!aNULL:!eNULL:!EXPORT:!DES:!RC4:!MD5:!PSK:!SRP:!CAMELLIA',
          checkServerIdentity: [Function: checkServerIdentity],
          minDHSize: 1024,
          session: Buffer(1737) [Uint8Array] [
             48, 130,   6, 197,   2,   1,   1,   2,   2,   3,   4,   4,
              2,  19,   2,   4,  32,  70, 145,  89,  53, 171, 164,  45,
             32, 245, 145,  80, 250,  97,  94, 143,  83,  40,  17, 148,
             98, 155, 251, 177, 114, 211, 136, 136,  82,  40,  98, 220,
             92,   4,  48, 158, 151,  61, 236,   1, 160,  77, 213, 156,
            112, 149,  89, 208, 110, 114,  82, 102,  25,  38,  49, 191,
            131,  30, 167, 147,  13,   4, 112,  95,  72,   6, 180, 117,
            193, 174, 160, 144, 218, 242,  62,  77,  50, 108,  72,  58,
             12,  48,  41, 161,
            ... 1637 more items
          ],
          maxRedirects: 21,
          maxBodyLength: 10485760,
          protocol: 'https:',
          path: null,
          method: 'POST',
          headers: {
            Accept: 'application/json, text/plain, */*',
            'Content-Type': 'application/json',
            'User-Agent': 'OpenAI/NodeJS/3.3.0',
            Authorization: 'Bearer sk-redacted',
            'Content-Length': 2628
          },
          agent: undefined,
          agents: { http: undefined, https: undefined },
          auth: undefined,
          hostname: 'api.openai.com',
          port: 443,
          nativeProtocols: { 'http:': [Object], 'https:': [Object] },
          pathname: '/v1/chat/completions',
          _defaultAgent: Agent {
            _events: [Object: null prototype],
            _eventsCount: 2,
            _maxListeners: undefined,
            defaultPort: 443,
            protocol: 'https:',
            options: [Object: null prototype],
            requests: [Object: null prototype] {},
            sockets: [Object: null prototype],
            freeSockets: [Object: null prototype] {},
            keepAliveMsecs: 1000,
            keepAlive: false,
            maxSockets: Infinity,
            maxFreeSockets: 256,
            scheduling: 'lifo',
            maxTotalSockets: Infinity,
            totalSocketCount: 4,
            maxCachedSessions: 100,
            _sessionCache: [Object],
            [Symbol(kCapture)]: false
          },
          host: 'api.openai.com',
          noDelay: true,
          servername: 'api.openai.com',
          _agentKey: 'api.openai.com:443:::::::::::::::::::::',
          encoding: null,
          singleUse: true
        }
      },
      _header: 'POST /v1/chat/completions HTTP/1.1\r\n' +
        'Accept: application/json, text/plain, */*\r\n' +
        'Content-Type: application/json\r\n' +
        'User-Agent: OpenAI/NodeJS/3.3.0\r\n' +
        'Authorization: Bearer sk-redacted\r\n' +
        'Content-Length: 2628\r\n' +
        'Host: api.openai.com\r\n' +
        'Connection: close\r\n' +
        '\r\n',
      _keepAliveTimeout: 0,
      _onPendingData: [Function: nop],
      agent: Agent {
        _events: [Object: null prototype] {
          free: [Function (anonymous)],
          newListener: [Function: maybeEnableKeylog]
        },
        _eventsCount: 2,
        _maxListeners: undefined,
        defaultPort: 443,
        protocol: 'https:',
        options: [Object: null prototype] { noDelay: true, path: null },
        requests: [Object: null prototype] {},
        sockets: [Object: null prototype] {
          'api.openai.com:443:::::::::::::::::::::': [ [TLSSocket], [TLSSocket], [TLSSocket], [TLSSocket] ]
        },
        freeSockets: [Object: null prototype] {},
        keepAliveMsecs: 1000,
        keepAlive: false,
        maxSockets: Infinity,
        maxFreeSockets: 256,
        scheduling: 'lifo',
        maxTotalSockets: Infinity,
        totalSocketCount: 4,
        maxCachedSessions: 100,
        _sessionCache: {
          map: {
            'api.openai.com:443:::::::::::::::::::::': [Buffer [Uint8Array]]
          },
          list: [ 'api.openai.com:443:::::::::::::::::::::' ]
        },
        [Symbol(kCapture)]: false
      },
      socketPath: undefined,
      method: 'POST',
      maxHeaderSize: undefined,
      insecureHTTPParser: undefined,
      path: '/v1/chat/completions',
      _ended: true,
      res: IncomingMessage {
        _readableState: ReadableState {
          objectMode: false,
          highWaterMark: 16384,
          buffer: BufferList { head: null, tail: null, length: 0 },
          length: 0,
          pipes: [],
          flowing: true,
          ended: true,
          endEmitted: true,
          reading: false,
          constructed: true,
          sync: true,
          needReadable: false,
          emittedReadable: false,
          readableListening: false,
          resumeScheduled: false,
          errorEmitted: false,
          emitClose: true,
          autoDestroy: true,
          destroyed: true,
          errored: null,
          closed: true,
          closeEmitted: true,
          defaultEncoding: 'utf8',
          awaitDrainWriters: null,
          multiAwaitDrain: false,
          readingMore: true,
          dataEmitted: true,
          decoder: null,
          encoding: null,
          [Symbol(kPaused)]: false
        },
        _events: [Object: null prototype] {
          end: [ [Function: responseOnEnd], [Function: handleStreamEnd] ],
          data: [Function: handleStreamData],
          aborted: [Function: handlerStreamAborted],
          error: [Function: handleStreamError]
        },
        _eventsCount: 4,
        _maxListeners: undefined,
        socket: <ref *1> TLSSocket {
          _tlsOptions: {
            allowHalfOpen: undefined,
            pipe: false,
            secureContext: [SecureContext],
            isServer: false,
            requestCert: true,
            rejectUnauthorized: true,
            session: [Buffer [Uint8Array]],
            ALPNProtocols: undefined,
            requestOCSP: undefined,
            enableTrace: undefined,
            pskCallback: undefined,
            highWaterMark: undefined,
            onread: undefined,
            signal: undefined
          },
          _secureEstablished: true,
          _securePending: false,
          _newSessionPending: false,
          _controlReleased: true,
          secureConnecting: false,
          _SNICallback: null,
          servername: 'api.openai.com',
          alpnProtocol: false,
          authorized: true,
          authorizationError: null,
          encrypted: true,
          _events: [Object: null prototype] {
            close: [Array],
            end: [Function: onReadableStreamEnd],
            newListener: [Function: keylogNewListener],
            secure: [Function: onConnectSecure],
            session: [Function (anonymous)],
            free: [Function: onFree],
            timeout: [Function: onTimeout],
            agentRemove: [Function: onRemove],
            error: [Function: socketErrorListener],
            finish: [Function]
          },
          _eventsCount: 10,
          connecting: false,
          _hadError: false,
          _parent: null,
          _host: 'api.openai.com',
          _closeAfterHandlingError: false,
          _readableState: ReadableState {
            objectMode: false,
            highWaterMark: 16384,
            buffer: [BufferList],
            length: 0,
            pipes: [],
            flowing: true,
            ended: false,
            endEmitted: false,
            reading: true,
            constructed: true,
            sync: false,
            needReadable: true,
            emittedReadable: false,
            readableListening: false,
            resumeScheduled: false,
            errorEmitted: false,
            emitClose: false,
            autoDestroy: true,
            destroyed: false,
            errored: null,
            closed: false,
            closeEmitted: false,
            defaultEncoding: 'utf8',
            awaitDrainWriters: null,
            multiAwaitDrain: false,
            readingMore: false,
            dataEmitted: true,
            decoder: null,
            encoding: null,
            [Symbol(kPaused)]: false
          },
          _maxListeners: undefined,
          _writableState: WritableState {
            objectMode: false,
            highWaterMark: 16384,
            finalCalled: true,
            needDrain: false,
            ending: true,
            ended: true,
            finished: false,
            destroyed: false,
            decodeStrings: false,
            defaultEncoding: 'utf8',
            length: 0,
            writing: false,
            corked: 0,
            sync: false,
            bufferProcessing: false,
            onwrite: [Function: bound onwrite],
            writecb: null,
            writelen: 0,
            afterWriteTickInfo: null,
            buffered: [],
            bufferedIndex: 0,
            allBuffers: true,
            allNoop: true,
            pendingcb: 1,
            constructed: true,
            prefinished: false,
            errorEmitted: false,
            emitClose: false,
            autoDestroy: true,
            errored: null,
            closed: false,
            closeEmitted: false,
            [Symbol(kOnFinished)]: []
          },
          allowHalfOpen: false,
          _sockname: null,
          _pendingData: null,
          _pendingEncoding: '',
          server: undefined,
          _server: null,
          ssl: TLSWrap {
            _parent: [TCP],
            _parentWrap: undefined,
            _secureContext: [SecureContext],
            reading: true,
            onkeylog: [Function: onkeylog],
            onhandshakestart: {},
            onhandshakedone: [Function (anonymous)],
            onocspresponse: [Function: onocspresponse],
            onnewsession: [Function: onnewsessionclient],
            onerror: [Function: onerror],
            [Symbol(owner_symbol)]: [Circular *1]
          },
          _requestCert: true,
          _rejectUnauthorized: true,
          parser: null,
          _httpMessage: [Circular *2],
          [Symbol(res)]: TLSWrap {
            _parent: [TCP],
            _parentWrap: undefined,
            _secureContext: [SecureContext],
            reading: true,
            onkeylog: [Function: onkeylog],
            onhandshakestart: {},
            onhandshakedone: [Function (anonymous)],
            onocspresponse: [Function: onocspresponse],
            onnewsession: [Function: onnewsessionclient],
            onerror: [Function: onerror],
            [Symbol(owner_symbol)]: [Circular *1]
          },
          [Symbol(verified)]: true,
          [Symbol(pendingSession)]: null,
          [Symbol(async_id_symbol)]: 4879,
          [Symbol(kHandle)]: TLSWrap {
            _parent: [TCP],
            _parentWrap: undefined,
            _secureContext: [SecureContext],
            reading: true,
            onkeylog: [Function: onkeylog],
            onhandshakestart: {},
            onhandshakedone: [Function (anonymous)],
            onocspresponse: [Function: onocspresponse],
            onnewsession: [Function: onnewsessionclient],
            onerror: [Function: onerror],
            [Symbol(owner_symbol)]: [Circular *1]
          },
          [Symbol(lastWriteQueueSize)]: 0,
          [Symbol(timeout)]: null,
          [Symbol(kBuffer)]: null,
          [Symbol(kBufferCb)]: null,
          [Symbol(kBufferGen)]: null,
          [Symbol(kCapture)]: false,
          [Symbol(kSetNoDelay)]: false,
          [Symbol(kSetKeepAlive)]: true,
          [Symbol(kSetKeepAliveInitialDelay)]: 60,
          [Symbol(kBytesRead)]: 0,
          [Symbol(kBytesWritten)]: 0,
          [Symbol(connect-options)]: {
            rejectUnauthorized: true,
            ciphers: 'TLS_AES_256_GCM_SHA384:TLS_CHACHA20_POLY1305_SHA256:TLS_AES_128_GCM_SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-AES256-GCM-SHA384:DHE-RSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-SHA256:DHE-RSA-AES128-SHA256:ECDHE-RSA-AES256-SHA384:DHE-RSA-AES256-SHA384:ECDHE-RSA-AES256-SHA256:DHE-RSA-AES256-SHA256:HIGH:!aNULL:!eNULL:!EXPORT:!DES:!RC4:!MD5:!PSK:!SRP:!CAMELLIA',
            checkServerIdentity: [Function: checkServerIdentity],
            minDHSize: 1024,
            session: [Buffer [Uint8Array]],
            maxRedirects: 21,
            maxBodyLength: 10485760,
            protocol: 'https:',
            path: null,
            method: 'POST',
            headers: [Object],
            agent: undefined,
            agents: [Object],
            auth: undefined,
            hostname: 'api.openai.com',
            port: 443,
            nativeProtocols: [Object],
            pathname: '/v1/chat/completions',
            _defaultAgent: [Agent],
            host: 'api.openai.com',
            noDelay: true,
            servername: 'api.openai.com',
            _agentKey: 'api.openai.com:443:::::::::::::::::::::',
            encoding: null,
            singleUse: true
          }
        },
        httpVersionMajor: 1,
        httpVersionMinor: 1,
        httpVersion: '1.1',
        complete: true,
        rawHeaders: [
          'Date',
          'Wed, 26 Jun 2024 21:41:32 GMT',
          'Content-Type',
          'application/json',
          'Content-Length',
          '211',
          'Connection',
          'close',
          'openai-organization',
          'jungle-genuis-llc',
          'openai-processing-ms',
          '1740',
          'openai-version',
          '2020-10-01',
          'strict-transport-security',
          'max-age=31536000; includeSubDomains',
          'x-ratelimit-limit-requests',
          '5000',
          'x-ratelimit-limit-tokens',
          '160000',
          'x-ratelimit-remaining-requests',
          '4994',
          'x-ratelimit-remaining-tokens',
          '156418',
          'x-ratelimit-reset-requests',
          '66ms',
          'x-ratelimit-reset-tokens',
          '1.342s',
          'x-request-id',
          '26aaa87d9eb8261ce239f850e96bf236',
          'CF-Cache-Status',
          'DYNAMIC',
          'Set-Cookie',
          '__cf_bm=WtuEb0w1wdxYCXSGvJVlfY0UN9AUYREDxE_ARbeysek-1719438092-1.0.1.1-PCOADWsjsP8M_QaPxlZiKY.V.4SRQLDJhDrbTD09OX8.aff_BXY_LumN9p0x377_t4t5ZflTwJkl30BHYsq3Dw; path=/; expires=Wed, 26-Jun-24 22:11:32 GMT; domain=.api.openai.com; HttpOnly; Secure; SameSite=None',
          'Set-Cookie',
          '_cfuvid=DF.3uPC9WgyTRTvzmj0rBFzZijvXlHHNuu5GIMmDb2Y-1719438092270-0.0.1.1-604800000; path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None',
          'Server',
          'cloudflare',
          'CF-RAY',
          '89a05ca089e621d9-MIA',
          'alt-svc',
          'h3=":443"; ma=86400'
        ],
        rawTrailers: [],
        aborted: false,
        upgrade: false,
        url: '',
        method: null,
        statusCode: 500,
        statusMessage: 'Internal Server Error',
        client: <ref *1> TLSSocket {
          _tlsOptions: {
            allowHalfOpen: undefined,
            pipe: false,
            secureContext: [SecureContext],
            isServer: false,
            requestCert: true,
            rejectUnauthorized: true,
            session: [Buffer [Uint8Array]],
            ALPNProtocols: undefined,
            requestOCSP: undefined,
            enableTrace: undefined,
            pskCallback: undefined,
            highWaterMark: undefined,
            onread: undefined,
            signal: undefined
          },
          _secureEstablished: true,
          _securePending: false,
          _newSessionPending: false,
          _controlReleased: true,
          secureConnecting: false,
          _SNICallback: null,
          servername: 'api.openai.com',
          alpnProtocol: false,
          authorized: true,
          authorizationError: null,
          encrypted: true,
          _events: [Object: null prototype] {
            close: [Array],
            end: [Function: onReadableStreamEnd],
            newListener: [Function: keylogNewListener],
            secure: [Function: onConnectSecure],
            session: [Function (anonymous)],
            free: [Function: onFree],
            timeout: [Function: onTimeout],
            agentRemove: [Function: onRemove],
            error: [Function: socketErrorListener],
            finish: [Function]
          },
          _eventsCount: 10,
          connecting: false,
          _hadError: false,
          _parent: null,
          _host: 'api.openai.com',
          _closeAfterHandlingError: false,
          _readableState: ReadableState {
            objectMode: false,
            highWaterMark: 16384,
            buffer: [BufferList],
            length: 0,
            pipes: [],
            flowing: true,
            ended: false,
            endEmitted: false,
            reading: true,
            constructed: true,
            sync: false,
            needReadable: true,
            emittedReadable: false,
            readableListening: false,
            resumeScheduled: false,
            errorEmitted: false,
            emitClose: false,
            autoDestroy: true,
            destroyed: false,
            errored: null,
            closed: false,
            closeEmitted: false,
            defaultEncoding: 'utf8',
            awaitDrainWriters: null,
            multiAwaitDrain: false,
            readingMore: false,
            dataEmitted: true,
            decoder: null,
            encoding: null,
            [Symbol(kPaused)]: false
          },
          _maxListeners: undefined,
          _writableState: WritableState {
            objectMode: false,
            highWaterMark: 16384,
            finalCalled: true,
            needDrain: false,
            ending: true,
            ended: true,
            finished: false,
            destroyed: false,
            decodeStrings: false,
            defaultEncoding: 'utf8',
            length: 0,
            writing: false,
            corked: 0,
            sync: false,
            bufferProcessing: false,
            onwrite: [Function: bound onwrite],
            writecb: null,
            writelen: 0,
            afterWriteTickInfo: null,
            buffered: [],
            bufferedIndex: 0,
            allBuffers: true,
            allNoop: true,
            pendingcb: 1,
            constructed: true,
            prefinished: false,
            errorEmitted: false,
            emitClose: false,
            autoDestroy: true,
            errored: null,
            closed: false,
            closeEmitted: false,
            [Symbol(kOnFinished)]: []
          },
          allowHalfOpen: false,
          _sockname: null,
          _pendingData: null,
          _pendingEncoding: '',
          server: undefined,
          _server: null,
          ssl: TLSWrap {
            _parent: [TCP],
            _parentWrap: undefined,
            _secureContext: [SecureContext],
            reading: true,
            onkeylog: [Function: onkeylog],
            onhandshakestart: {},
            onhandshakedone: [Function (anonymous)],
            onocspresponse: [Function: onocspresponse],
            onnewsession: [Function: onnewsessionclient],
            onerror: [Function: onerror],
            [Symbol(owner_symbol)]: [Circular *1]
          },
          _requestCert: true,
          _rejectUnauthorized: true,
          parser: null,
          _httpMessage: [Circular *2],
          [Symbol(res)]: TLSWrap {
            _parent: [TCP],
            _parentWrap: undefined,
            _secureContext: [SecureContext],
            reading: true,
            onkeylog: [Function: onkeylog],
            onhandshakestart: {},
            onhandshakedone: [Function (anonymous)],
            onocspresponse: [Function: onocspresponse],
            onnewsession: [Function: onnewsessionclient],
            onerror: [Function: onerror],
            [Symbol(owner_symbol)]: [Circular *1]
          },
          [Symbol(verified)]: true,
          [Symbol(pendingSession)]: null,
          [Symbol(async_id_symbol)]: 4879,
          [Symbol(kHandle)]: TLSWrap {
            _parent: [TCP],
            _parentWrap: undefined,
            _secureContext: [SecureContext],
            reading: true,
            onkeylog: [Function: onkeylog],
            onhandshakestart: {},
            onhandshakedone: [Function (anonymous)],
            onocspresponse: [Function: onocspresponse],
            onnewsession: [Function: onnewsessionclient],
            onerror: [Function: onerror],
            [Symbol(owner_symbol)]: [Circular *1]
          },
          [Symbol(lastWriteQueueSize)]: 0,
          [Symbol(timeout)]: null,
          [Symbol(kBuffer)]: null,
          [Symbol(kBufferCb)]: null,
          [Symbol(kBufferGen)]: null,
          [Symbol(kCapture)]: false,
          [Symbol(kSetNoDelay)]: false,
          [Symbol(kSetKeepAlive)]: true,
          [Symbol(kSetKeepAliveInitialDelay)]: 60,
          [Symbol(kBytesRead)]: 0,
          [Symbol(kBytesWritten)]: 0,
          [Symbol(connect-options)]: {
            rejectUnauthorized: true,
            ciphers: 'TLS_AES_256_GCM_SHA384:TLS_CHACHA20_POLY1305_SHA256:TLS_AES_128_GCM_SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-AES256-GCM-SHA384:DHE-RSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-SHA256:DHE-RSA-AES128-SHA256:ECDHE-RSA-AES256-SHA384:DHE-RSA-AES256-SHA384:ECDHE-RSA-AES256-SHA256:DHE-RSA-AES256-SHA256:HIGH:!aNULL:!eNULL:!EXPORT:!DES:!RC4:!MD5:!PSK:!SRP:!CAMELLIA',
            checkServerIdentity: [Function: checkServerIdentity],
            minDHSize: 1024,
            session: [Buffer [Uint8Array]],
            maxRedirects: 21,
            maxBodyLength: 10485760,
            protocol: 'https:',
            path: null,
            method: 'POST',
            headers: [Object],
            agent: undefined,
            agents: [Object],
            auth: undefined,
            hostname: 'api.openai.com',
            port: 443,
            nativeProtocols: [Object],
            pathname: '/v1/chat/completions',
            _defaultAgent: [Agent],
            host: 'api.openai.com',
            noDelay: true,
            servername: 'api.openai.com',
            _agentKey: 'api.openai.com:443:::::::::::::::::::::',
            encoding: null,
            singleUse: true
          }
        },
        _consuming: false,
        _dumped: false,
        req: [Circular *2],
        responseUrl: 'https://api.openai.com/v1/chat/completions',
        redirects: [],
        [Symbol(kCapture)]: false,
        [Symbol(kHeaders)]: {
          date: 'Wed, 26 Jun 2024 21:41:32 GMT',
          'content-type': 'application/json',
          'content-length': '211',
          connection: 'close',
          'openai-organization': 'jungle-genuis-llc',
          'openai-processing-ms': '1740',
          'openai-version': '2020-10-01',
          'strict-transport-security': 'max-age=31536000; includeSubDomains',
          'x-ratelimit-limit-requests': '5000',
          'x-ratelimit-limit-tokens': '160000',
          'x-ratelimit-remaining-requests': '4994',
          'x-ratelimit-remaining-tokens': '156418',
          'x-ratelimit-reset-requests': '66ms',
          'x-ratelimit-reset-tokens': '1.342s',
          'x-request-id': '26aaa87d9eb8261ce239f850e96bf236',
          'cf-cache-status': 'DYNAMIC',
          'set-cookie': [
            '__cf_bm=WtuEb0w1wdxYCXSGvJVlfY0UN9AUYREDxE_ARbeysek-1719438092-1.0.1.1-PCOADWsjsP8M_QaPxlZiKY.V.4SRQLDJhDrbTD09OX8.aff_BXY_LumN9p0x377_t4t5ZflTwJkl30BHYsq3Dw; path=/; expires=Wed, 26-Jun-24 22:11:32 GMT; domain=.api.openai.com; HttpOnly; Secure; SameSite=None',
            '_cfuvid=DF.3uPC9WgyTRTvzmj0rBFzZijvXlHHNuu5GIMmDb2Y-1719438092270-0.0.1.1-604800000; path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None'
          ],
          server: 'cloudflare',
          'cf-ray': '89a05ca089e621d9-MIA',
          'alt-svc': 'h3=":443"; ma=86400'
        },
        [Symbol(kHeadersCount)]: 42,
        [Symbol(kTrailers)]: null,
        [Symbol(kTrailersCount)]: 0
      },
      aborted: false,
      timeoutCb: null,
      upgradeOrConnect: false,
      parser: null,
      maxHeadersCount: null,
      reusedSocket: false,
      host: 'api.openai.com',
      protocol: 'https:',
      _redirectable: Writable {
        _writableState: WritableState {
          objectMode: false,
          highWaterMark: 16384,
          finalCalled: false,
          needDrain: false,
          ending: false,
          ended: false,
          finished: false,
          destroyed: false,
          decodeStrings: true,
          defaultEncoding: 'utf8',
          length: 0,
          writing: false,
          corked: 0,
          sync: true,
          bufferProcessing: false,
          onwrite: [Function: bound onwrite],
          writecb: null,
          writelen: 0,
          afterWriteTickInfo: null,
          buffered: [],
          bufferedIndex: 0,
          allBuffers: true,
          allNoop: true,
          pendingcb: 0,
          constructed: true,
          prefinished: false,
          errorEmitted: false,
          emitClose: true,
          autoDestroy: true,
          errored: null,
          closed: false,
          closeEmitted: false,
          [Symbol(kOnFinished)]: []
        },
        _events: [Object: null prototype] {
          response: [Function: handleResponse],
          error: [Function: handleRequestError],
          socket: [Function: handleRequestSocket]
        },
        _eventsCount: 3,
        _maxListeners: undefined,
        _options: {
          maxRedirects: 21,
          maxBodyLength: 10485760,
          protocol: 'https:',
          path: '/v1/chat/completions',
          method: 'POST',
          headers: {
            Accept: 'application/json, text/plain, */*',
            'Content-Type': 'application/json',
            'User-Agent': 'OpenAI/NodeJS/3.3.0',
            Authorization: 'Bearer sk-redacted',
            'Content-Length': 2628
          },
          agent: undefined,
          agents: { http: undefined, https: undefined },
          auth: undefined,
          hostname: 'api.openai.com',
          port: null,
          nativeProtocols: { 'http:': [Object], 'https:': [Object] },
          pathname: '/v1/chat/completions'
        },
        _ended: true,
        _ending: true,
        _redirectCount: 0,
        _redirects: [],
        _requestBodyLength: 2628,
        _requestBodyBuffers: [],
        _onNativeResponse: [Function (anonymous)],
        _currentRequest: [Circular *2],
        _currentUrl: 'https://api.openai.com/v1/chat/completions',
        [Symbol(kCapture)]: false
      },
      [Symbol(kCapture)]: false,
      [Symbol(kBytesWritten)]: 0,
      [Symbol(kEndCalled)]: true,
      [Symbol(kNeedDrain)]: false,
      [Symbol(corked)]: 0,
      [Symbol(kOutHeaders)]: [Object: null prototype] {
        accept: [ 'Accept', 'application/json, text/plain, */*' ],
        'content-type': [ 'Content-Type', 'application/json' ],
        'user-agent': [ 'User-Agent', 'OpenAI/NodeJS/3.3.0' ],
        authorization: [
          'Authorization',
          'Bearer sk-redacted'
        ],
        'content-length': [ 'Content-Length', 2628 ],
        host: [ 'Host', 'api.openai.com' ]
      },
      [Symbol(errored)]: null,
      [Symbol(kUniqueHeaders)]: null
    },
    data: {
      error: {
        message: 'The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently.',
        type: 'model_error',
        param: null,
        code: null
      }
    }
  },
  isAxiosError: true,
  toJSON: [Function: toJSON]
}

Node.js v18.13.0
<!-- gh-comment-id:2192707342 --> @JungleGenius commented on GitHub (Jun 26, 2024):

@jehna Looks like it was grinding away with 65,000+ tokens. However, it died with the same `model_error` 500 trace as above.

@JungleGenius commented on GitHub (Jul 2, 2024):

Any comments?


@jehna commented on GitHub (Jul 2, 2024):

Sounds very much the same issue as #12

A fix should be straightforward, just have zero time to work on this project at the moment. Would appreciate a PR (see latest comment at #12 for potential fix)


@0xdevalias commented on GitHub (Jul 3, 2024):

From https://github.com/jehna/humanify/issues/18#issuecomment-2192707342

      error: {
        message: 'The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently.',
        type: 'model_error',
        param: null,
        code: null
      }

Googling for that error message led me to this thread, which has at least one potential lead/solution to look into (though seems it can also be caused by other things too, based on a later comment in the thread):

  • https://community.openai.com/t/error-the-model-produced-invalid-content/747511
    • When I switched to gpt-4o I started getting these errors that I never got using previous models
      I can’t confirm but this seems to happen when the model tries to call a function after exchanging some messages. Sometime I try again and it works, sometimes it gets stuck

    • same situation. have a tool call in messages. maybe that is triggering the error.

    • I think I got it fixed. In case the issue was due to a bit of a mix up in my code. I was mixing the new tools mechanism with the old function calling mechanism and somehow that was passing through in the older models. The new gpt-4o model seems to be more strict about that issue but after fixing following the new tools API correctly the issues seems to be gone, at least for now.

    • I’ve tried updating to the latest API version and changing the way I call tools and tools_choice, but I’m still getting the same error. Have you had a similar error again? Or have you already solved it with this? Because if you have had it again it would give me the clue that OpenAI has not yet been able to solve it on their side, but if you have not had it again, it would mean that I am still writing something wrong in my code.

    • Once i fixed the way I was using tools I never got this error message again.
      Make sure you’re passing all the right ids, function names and parameters in the right orders and you should be good to go


@0xdevalias commented on GitHub (Jul 3, 2024):

I also noticed that this project is using the node openai 3.3.0 package, whereas it's up to 4.52.3 now (that might not make a difference at all, but you never know):

  • https://www.npmjs.com/package/openai
  • https://github.com/openai/openai-node/releases

github.com/jehna/humanify@002fd68d10/package.json (L22)

The changelog entries only seem to start from 4.2.0:

  • https://github.com/openai/openai-node/blob/master/CHANGELOG.md#420-2023-08-23

There are a few bugfix entries related to handling errors while streaming. I wonder if that might be helpful?

  • https://github.com/openai/openai-node/blob/master/CHANGELOG.md#4122-2023-10-16

Also some related to tools/functions, eg.

  • https://github.com/openai/openai-node/blob/master/CHANGELOG.md#4210-2023-12-11
    • https://github.com/openai/openai-node/pull/562
  • https://github.com/openai/openai-node/blob/master/CHANGELOG.md#4221-2023-12-15
    • https://github.com/openai/openai-node/issues/570

And new model:

  • https://github.com/openai/openai-node/blob/master/CHANGELOG.md#4460-2024-05-13
    • https://github.com/openai/openai-node/issues/841

  • https://github.com/openai/openai-node#automated-function-calls
    • We provide the openai.beta.chat.completions.runTools({…}) convenience helper for using function tool calls with the /chat/completions endpoint which automatically call the JavaScript functions you provide and sends their results back to the /chat/completions endpoint, looping as long as the model requests tool calls.

    • If you pass a parse function, it will automatically parse the arguments for you and returns any parsing errors to the model to attempt auto-recovery. Otherwise, the args will be passed to the function you provide as a string.

    • If you pass tool_choice: {function: {name: …}} instead of auto, it returns immediately after calling that function (and only loops to auto-recover parsing errors).

    • Note that runFunctions was previously available as well, but has been deprecated in favor of runTools.
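A hedged sketch of how that helper could slot in, assuming openai@4.x. The `record_renames` tool name and schema are illustrative, not humanify's actual definitions, and the caller is assumed to supply the client (e.g. `new OpenAI({ apiKey })`):

```typescript
// Sketch only: using openai@4.x's runTools helper in place of a manual
// function-calling loop. Tool name and schema are illustrative.
async function suggestRenames(client: any, code: string): Promise<string | null> {
  // Bail out gracefully if we got no client or an old SDK (3.x has no beta.chat).
  if (!client?.beta?.chat?.completions?.runTools) return null;
  const runner = client.beta.chat.completions.runTools({
    model: "gpt-4o",
    messages: [{ role: "user", content: `Suggest renames for:\n${code}` }],
    tools: [
      {
        type: "function",
        function: {
          name: "record_renames",
          description: "Record suggested identifier renames",
          parameters: {
            type: "object",
            properties: { renames: { type: "array", items: { type: "object" } } },
            required: ["renames"],
          },
          // Parsing errors are sent back to the model for auto-recovery.
          parse: JSON.parse,
          function: (args: { renames: object[] }) => JSON.stringify(args.renames),
        },
      },
    ],
  });
  // Resolves once the model stops requesting tool calls.
  return await runner.finalContent();
}
```

The helper would then own the request/response loop that humanify currently implements by hand, including re-prompting the model when tool arguments fail to parse.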


We can see that the openai stuff is initially called here, which generates a 'plugin' that is then applied to the code of each of the files extracted with webcrack:

github.com/jehna/humanify@002fd68d10/src/index.ts (L56-L71)

With the main logic being implemented here, and the actual SDK call within codeToVariableRenames:

github.com/jehna/humanify@002fd68d10/src/openai/openai.ts (L14-L82)

We can also see that it's using the functions config, which is deprecated now, and replaced by tools:

  • https://platform.openai.com/docs/api-reference/chat/create#chat-create-functions
  • https://platform.openai.com/docs/api-reference/chat/create#chat-create-tools

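The mechanical part of that `functions` → `tools` migration is small. A sketch of rewriting an old-style request body into the newer shape (field names per the chat completions API; the surrounding humanify code would still need the SDK upgrade):

```typescript
// Sketch: translate a deprecated `functions`/`function_call` request
// body into the newer `tools`/`tool_choice` shape.
type OldStyleRequest = {
  model: string;
  messages: unknown[];
  functions?: { name: string; description?: string; parameters?: object }[];
  function_call?: "auto" | "none" | { name: string };
};

function migrateToTools(req: OldStyleRequest) {
  const { functions, function_call, ...rest } = req;
  return {
    ...rest,
    // Each function definition is wrapped in a { type, function } envelope.
    tools: functions?.map((fn) => ({ type: "function" as const, function: fn })),
    // A named function_call gains the same envelope; "auto"/"none" pass through.
    tool_choice:
      typeof function_call === "object"
        ? { type: "function" as const, function: { name: function_call.name } }
        : function_call,
  };
}
```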
Edit: Captured the above notes in a new more specific issue:

  • https://github.com/jehna/humanify/issues/19

@JungleGenius commented on GitHub (Jul 5, 2024):

@0xdevalias @jehna

So I was able to get a little bit farther just by updating the model's instructions. The whole response_format: { type: "json_object" }, keeps getting rejected as invalid.

-            "Rename all Javascript variables and functions to have descriptive names based on their usage in the code.",
+            "Rename all Javascript variables and functions to have descriptive names based on their usage in the code. Instruct the model to produce valid JSON and to properly escape JSON attributes and values.",

But now I'm getting this.

/root/decode/humanify/src/openai/rename-variables-and-functions.ts:17
          const rename = toRename.find((r) => r.name === path.node.name);
                                                ^
TypeError: unknown file: Cannot read properties of undefined (reading 'name')
    at file:///root/decode/humanify/src/openai/rename-variables-and-functions.ts:1:382

Looks like we are getting null values? Not sure how to debug this. I'm just running this from the CLI and editing the source files with mcedit. I don't normally do Node.js development so I don't have a full Node.js IDE handy.

If this was a browser I would just do a console.log() and dump the return variables. I'm hoping to get this working because I need to reverse Closure Compiled file for a whole other client project.
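That TypeError suggests the model occasionally returns rename entries that are missing or malformed, and the visitor trips over them. A defensive sketch that filters the parsed response before it reaches the Babel visitor; the `{ name, newName }` shape is assumed from the `toRename.find` call in the error above, not taken from humanify's source:

```typescript
// Sketch: drop malformed entries from the model's rename list before
// the visitor calls `toRename.find((r) => r.name === path.node.name)`.
// The { name, newName } entry shape is an assumption.
type Rename = { name: string; newName: string };

function sanitizeRenames(raw: unknown): Rename[] {
  if (!Array.isArray(raw)) return []; // model returned non-array JSON
  return raw.filter(
    (r): r is Rename =>
      r != null &&
      typeof r === "object" &&
      typeof (r as Rename).name === "string" &&
      typeof (r as Rename).newName === "string"
  );
}
```

This would turn a hard crash into a partial rename pass, and logging whatever gets filtered out would show exactly what invalid content the model produced.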


@0xdevalias commented on GitHub (Jul 6, 2024):

The whole response_format: { type: "json_object" }, keeps getting rejected as invalid.

@JungleGenius Rejected by what part/with what error/etc? Do you think it relates to the SDK being too old to use that, as per my previous thoughts?

I also noticed that this project is using the node openai 3.3.0 package, whereas it's up to 4.52.3 now (that might not make a difference at all, but you never know)

  • https://github.com/jehna/humanify/issues/19

If this was a browser I would just do a console.log() and dump the return variables.

@JungleGenius Pretty sure you should be able to do that here as well, and it would write to the terminal output.

You could also run the node script with debugging enabled and then connect to it from Chrome's debugger if you wanted something more powerful there:


@0xdevalias commented on GitHub (Aug 12, 2024):

This should now be fixed in v2 since there's the long awaited JSON mode with the new structured outputs. Please take a look and reopen if anything comes up

Originally posted by @jehna in https://github.com/jehna/humanify/issues/22#issuecomment-2282876269

See also:

  • https://github.com/jehna/humanify/issues/31