Middleware and an `Upload` scalar to add support for GraphQL multipart requests (file uploads via queries and mutations) to various Node.js GraphQL servers.
Clients implementing the GraphQL multipart request spec upload files as `Upload` scalar query or mutation variables. Their resolver values are promises that resolve file upload details for processing and storage. Files are typically streamed into cloud storage but may also be stored in the filesystem.
> [!TIP]
> First, check if there are GraphQL multipart request spec server implementations (most for Node.js integrate `graphql-upload`) that are more suitable for your environment than a manual setup.
To install `graphql-upload` and its peer dependency `graphql` with npm, run:

```sh
npm install graphql-upload graphql
```
Use the middleware `graphqlUploadKoa` or `graphqlUploadExpress` just before GraphQL middleware. Alternatively, use the function `processRequest` to create custom middleware.
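As a minimal sketch, wiring the middleware into an Express app might look like this (the route path and option values are illustrative, and the GraphQL middleware itself is left out):

```javascript
import express from "express";
import graphqlUploadExpress from "graphql-upload/graphqlUploadExpress.mjs";

const app = express();

app.use(
  "/graphql",
  // Parse multipart requests before the GraphQL middleware runs.
  // Option values here are illustrative limits, not defaults.
  graphqlUploadExpress({ maxFileSize: 10_000_000, maxFiles: 10 }),
  // … GraphQL middleware of your choice goes here.
);

app.listen(4000);
```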
A schema built with separate SDL and resolvers (e.g. using the function `makeExecutableSchema` from `@graphql-tools/schema`) requires the scalar `Upload` to be set up.
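Registering the scalar when building such a schema might look like this sketch (the mutation field is hypothetical; only the `Upload` scalar mapping is the point):

```javascript
import { makeExecutableSchema } from "@graphql-tools/schema";
import GraphQLUpload from "graphql-upload/GraphQLUpload.mjs";

const schema = makeExecutableSchema({
  typeDefs: /* GraphQL */ `
    scalar Upload

    type Query {
      ok: Boolean!
    }

    type Mutation {
      # Hypothetical mutation accepting a file upload.
      uploadFile(file: Upload!): Boolean!
    }
  `,
  resolvers: {
    // Map the SDL scalar to the implementation exported by graphql-upload.
    Upload: GraphQLUpload,
  },
});
```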
Then, the scalar `Upload` can be used for query or mutation arguments. For how to use the scalar value in resolvers, see the documentation in the module `GraphQLUpload.mjs`.
- Apollo upload examples repo; a full stack demo of file uploads via GraphQL mutations using `apollo-server-koa` and `@apollo/client`.
- The process must have both read and write access to the directory identified by `os.tmpdir()`.
- The device requires sufficient disk space to buffer the expected number of concurrent upload requests.
- Promisify and await file upload streams in resolvers, or the server will send a response to the client before uploads are complete, causing a disconnect.
- Handle file upload promise rejection and stream errors; uploads sometimes fail due to network connectivity issues or impatient users disconnecting.
- Process multiple uploads asynchronously with `Promise.all` or a more flexible solution such as `Promise.allSettled`, where an error in one does not reject them all.
- Only use the function `createReadStream` before the resolver returns; late calls (e.g. in an unawaited async function or callback) throw an error. Existing streams can still be used after a response is sent, although there are few valid reasons for not awaiting their completion.
- Use `stream.destroy()` when an incomplete stream is no longer needed, or temporary files may not get cleaned up.
The GraphQL multipart request spec allows a file to be used for multiple query or mutation variables (file deduplication), and for variables to be used in multiple places. GraphQL resolvers need to be able to manage independent file streams. As resolvers are executed asynchronously, it’s possible they will try to process files in a different order than received in the multipart request.
`busboy` parses multipart request streams. Once the `operations` and `map` fields have been parsed, scalar `Upload` values in the GraphQL operations are populated with promises, and the operations are passed down the middleware chain to GraphQL resolvers.
`fs-capacitor` is used to buffer file uploads to the filesystem and coordinate simultaneous reading and writing. As soon as a file upload's contents begin streaming, its data begins buffering to the filesystem and its associated promise resolves. GraphQL resolvers can then create new streams from the buffer by calling the function `createReadStream`. The buffer is destroyed once all streams have ended or closed and the server has responded to the request. Any remaining buffer files will be cleaned up when the process exits.
Supported runtime environments:

- Node.js versions `^18.18.0 || ^20.9.0 || >=22.0.0`.
Projects must configure TypeScript to use types from the ECMAScript modules that have a `// @ts-check` comment:

- `compilerOptions.allowJs` should be `true`.
- `compilerOptions.maxNodeModuleJsDepth` should be reasonably large, e.g. `10`.
- `compilerOptions.module` should be `"node16"` or `"nodenext"`.
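A matching `tsconfig.json` might contain (a sketch covering only the options listed above; the rest depends on the project):

```json
{
  "compilerOptions": {
    "allowJs": true,
    "maxNodeModuleJsDepth": 10,
    "module": "nodenext"
  }
}
```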
The npm package `graphql-upload` features optimal JavaScript module design. It doesn't have a main index module, so use deep imports from the ECMAScript modules that are exported via the `package.json` field `exports`:
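For example, deep imports of the modules named earlier in this document might look like this (import only the modules you actually need):

```javascript
import GraphQLUpload from "graphql-upload/GraphQLUpload.mjs";
import graphqlUploadExpress from "graphql-upload/graphqlUploadExpress.mjs";
import processRequest from "graphql-upload/processRequest.mjs";
```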