Refactor routing in the App component to improve navigation and error handling by adding dynamic routes and updating the NotFound route.
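
The diff below only shows vendored dependencies being checked in (node_modules/sharp); the routing change described above is not visible in this excerpt. As a rough, illustrative sketch of the kind of setup the message describes (assuming React Router v6; the route paths and page components here are hypothetical and not taken from the actual App component):

```jsx
// Illustrative sketch only; assumes react-router-dom v6.
// Paths and page components are hypothetical, not from the real App component.
import { BrowserRouter, Routes, Route } from 'react-router-dom';
import Home from './pages/Home';
import ItemDetail from './pages/ItemDetail';
import NotFound from './pages/NotFound';

export default function App() {
  return (
    <BrowserRouter>
      <Routes>
        <Route path="/" element={<Home />} />
        {/* Dynamic route: ":id" is resolved from the URL at render time */}
        <Route path="/items/:id" element={<ItemDetail />} />
        {/* Catch-all route: unknown URLs fall through to the NotFound page */}
        <Route path="*" element={<NotFound />} />
      </Routes>
    </BrowserRouter>
  );
}
```

The catch-all `*` route is what lets unrecognised URLs render the NotFound page instead of an empty layout, which is the error-handling improvement the commit message refers to.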
node_modules/sharp/LICENSE (new file, 191 lines, generated, vendored)
@@ -0,0 +1,191 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/

TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

1. Definitions.

"License" shall mean the terms and conditions for use, reproduction, and
distribution as defined by Sections 1 through 9 of this document.

"Licensor" shall mean the copyright owner or entity authorized by the copyright
owner that is granting the License.

"Legal Entity" shall mean the union of the acting entity and all other entities
that control, are controlled by, or are under common control with that entity.
For the purposes of this definition, "control" means (i) the power, direct or
indirect, to cause the direction or management of such entity, whether by
contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.

"You" (or "Your") shall mean an individual or Legal Entity exercising
permissions granted by this License.

"Source" form shall mean the preferred form for making modifications, including
but not limited to software source code, documentation source, and configuration
files.

"Object" form shall mean any form resulting from mechanical transformation or
translation of a Source form, including but not limited to compiled object code,
generated documentation, and conversions to other media types.

"Work" shall mean the work of authorship, whether in Source or Object form, made
available under the License, as indicated by a copyright notice that is included
in or attached to the work (an example is provided in the Appendix below).

"Derivative Works" shall mean any work, whether in Source or Object form, that
is based on (or derived from) the Work and for which the editorial revisions,
annotations, elaborations, or other modifications represent, as a whole, an
original work of authorship. For the purposes of this License, Derivative Works
shall not include works that remain separable from, or merely link (or bind by
name) to the interfaces of, the Work and Derivative Works thereof.

"Contribution" shall mean any work of authorship, including the original version
of the Work and any modifications or additions to that Work or Derivative Works
thereof, that is intentionally submitted to Licensor for inclusion in the Work
by the copyright owner or by an individual or Legal Entity authorized to submit
on behalf of the copyright owner. For the purposes of this definition,
"submitted" means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems, and
issue tracking systems that are managed by, or on behalf of, the Licensor for
the purpose of discussing and improving the Work, but excluding communication
that is conspicuously marked or otherwise designated in writing by the copyright
owner as "Not a Contribution."

"Contributor" shall mean Licensor and any individual or Legal Entity on behalf
of whom a Contribution has been received by Licensor and subsequently
incorporated within the Work.

2. Grant of Copyright License.

Subject to the terms and conditions of this License, each Contributor hereby
grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free,
irrevocable copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the Work and such
Derivative Works in Source or Object form.

3. Grant of Patent License.

Subject to the terms and conditions of this License, each Contributor hereby
grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free,
irrevocable (except as stated in this section) patent license to make, have
made, use, offer to sell, sell, import, and otherwise transfer the Work, where
such license applies only to those patent claims licensable by such Contributor
that are necessarily infringed by their Contribution(s) alone or by combination
of their Contribution(s) with the Work to which such Contribution(s) was
submitted. If You institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work or a
Contribution incorporated within the Work constitutes direct or contributory
patent infringement, then any patent licenses granted to You under this License
for that Work shall terminate as of the date such litigation is filed.

4. Redistribution.

You may reproduce and distribute copies of the Work or Derivative Works thereof
in any medium, with or without modifications, and in Source or Object form,
provided that You meet the following conditions:

You must give any other recipients of the Work or Derivative Works a copy of
this License; and
You must cause any modified files to carry prominent notices stating that You
changed the files; and
You must retain, in the Source form of any Derivative Works that You distribute,
all copyright, patent, trademark, and attribution notices from the Source form
of the Work, excluding those notices that do not pertain to any part of the
Derivative Works; and
If the Work includes a "NOTICE" text file as part of its distribution, then any
Derivative Works that You distribute must include a readable copy of the
attribution notices contained within such NOTICE file, excluding those notices
that do not pertain to any part of the Derivative Works, in at least one of the
following places: within a NOTICE text file distributed as part of the
Derivative Works; within the Source form or documentation, if provided along
with the Derivative Works; or, within a display generated by the Derivative
Works, if and wherever such third-party notices normally appear. The contents of
the NOTICE file are for informational purposes only and do not modify the
License. You may add Your own attribution notices within Derivative Works that
You distribute, alongside or as an addendum to the NOTICE text from the Work,
provided that such additional attribution notices cannot be construed as
modifying the License.
You may add Your own copyright statement to Your modifications and may provide
additional or different license terms and conditions for use, reproduction, or
distribution of Your modifications, or for any such Derivative Works as a whole,
provided Your use, reproduction, and distribution of the Work otherwise complies
with the conditions stated in this License.

5. Submission of Contributions.

Unless You explicitly state otherwise, any Contribution intentionally submitted
for inclusion in the Work by You to the Licensor shall be under the terms and
conditions of this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify the terms of
any separate license agreement you may have executed with Licensor regarding
such Contributions.

6. Trademarks.

This License does not grant permission to use the trade names, trademarks,
service marks, or product names of the Licensor, except as required for
reasonable and customary use in describing the origin of the Work and
reproducing the content of the NOTICE file.

7. Disclaimer of Warranty.

Unless required by applicable law or agreed to in writing, Licensor provides the
Work (and each Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied,
including, without limitation, any warranties or conditions of TITLE,
NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are
solely responsible for determining the appropriateness of using or
redistributing the Work and assume any risks associated with Your exercise of
permissions under this License.

8. Limitation of Liability.

In no event and under no legal theory, whether in tort (including negligence),
contract, or otherwise, unless required by applicable law (such as deliberate
and grossly negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special, incidental,
or consequential damages of any character arising as a result of this License or
out of the use or inability to use the Work (including but not limited to
damages for loss of goodwill, work stoppage, computer failure or malfunction, or
any and all other commercial damages or losses), even if such Contributor has
been advised of the possibility of such damages.

9. Accepting Warranty or Additional Liability.

While redistributing the Work or Derivative Works thereof, You may choose to
offer, and charge a fee for, acceptance of support, warranty, indemnity, or
other liability obligations and/or rights consistent with this License. However,
in accepting such obligations, You may act only on Your own behalf and on Your
sole responsibility, not on behalf of any other Contributor, and only if You
agree to indemnify, defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason of your
accepting any such warranty or additional liability.

END OF TERMS AND CONDITIONS

APPENDIX: How to apply the Apache License to your work

To apply the Apache License to your work, attach the following boilerplate
notice, with the fields enclosed by brackets "[]" replaced with your own
identifying information. (Don't include the brackets!) The text should be
enclosed in the appropriate comment syntax for the file format. We also
recommend that a file or class name and description of purpose be included on
the same "printed page" as the copyright notice for easier identification within
third-party archives.

Copyright [yyyy] [name of copyright owner]

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
node_modules/sharp/README.md (new file, 118 lines, generated, vendored)
@@ -0,0 +1,118 @@
# sharp

<img src="https://cdn.jsdelivr.net/gh/lovell/sharp@main/docs/image/sharp-logo.svg" width="160" height="160" alt="sharp logo" align="right">

The typical use case for this high speed Node-API module
is to convert large images in common formats to
smaller, web-friendly JPEG, PNG, WebP, GIF and AVIF images of varying dimensions.

It can be used with all JavaScript runtimes
that provide support for Node-API v9, including
Node.js (^18.17.0 or >= 20.3.0), Deno and Bun.

Resizing an image is typically 4x-5x faster than using the
quickest ImageMagick and GraphicsMagick settings
due to its use of [libvips](https://github.com/libvips/libvips).

Colour spaces, embedded ICC profiles and alpha transparency channels are all handled correctly.
Lanczos resampling ensures quality is not sacrificed for speed.

As well as image resizing, operations such as
rotation, extraction, compositing and gamma correction are available.

Most modern macOS, Windows and Linux systems
do not require any additional install or runtime dependencies.

## Documentation

Visit [sharp.pixelplumbing.com](https://sharp.pixelplumbing.com/) for complete
[installation instructions](https://sharp.pixelplumbing.com/install),
[API documentation](https://sharp.pixelplumbing.com/api-constructor),
[benchmark tests](https://sharp.pixelplumbing.com/performance) and
[changelog](https://sharp.pixelplumbing.com/changelog).

## Examples

```sh
npm install sharp
```

```javascript
const sharp = require('sharp');
```

### Callback

```javascript
sharp(inputBuffer)
  .resize(320, 240)
  .toFile('output.webp', (err, info) => { ... });
```

### Promise

```javascript
sharp('input.jpg')
  .rotate()
  .resize(200)
  .jpeg({ mozjpeg: true })
  .toBuffer()
  .then( data => { ... })
  .catch( err => { ... });
```

### Async/await

```javascript
const semiTransparentRedPng = await sharp({
  create: {
    width: 48,
    height: 48,
    channels: 4,
    background: { r: 255, g: 0, b: 0, alpha: 0.5 }
  }
})
  .png()
  .toBuffer();
```

### Stream

```javascript
const roundedCorners = Buffer.from(
  '<svg><rect x="0" y="0" width="200" height="200" rx="50" ry="50"/></svg>'
);

const roundedCornerResizer =
  sharp()
    .resize(200, 200)
    .composite([{
      input: roundedCorners,
      blend: 'dest-in'
    }])
    .png();

readableStream
  .pipe(roundedCornerResizer)
  .pipe(writableStream);
```

## Contributing

A [guide for contributors](https://github.com/lovell/sharp/blob/main/.github/CONTRIBUTING.md)
covers reporting bugs, requesting features and submitting code changes.

## Licensing

Copyright 2013 Lovell Fuller and others.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
[https://www.apache.org/licenses/LICENSE-2.0](https://www.apache.org/licenses/LICENSE-2.0)

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
node_modules/sharp/install/check.js (new file, 41 lines, generated, vendored)
@@ -0,0 +1,41 @@
// Copyright 2013 Lovell Fuller and others.
// SPDX-License-Identifier: Apache-2.0

'use strict';

try {
  const { useGlobalLibvips, globalLibvipsVersion, log, spawnRebuild } = require('../lib/libvips');

  const buildFromSource = (msg) => {
    log(msg);
    log('Attempting to build from source via node-gyp');
    try {
      const addonApi = require('node-addon-api');
      log(`Found node-addon-api ${addonApi.version || ''}`);
    } catch (err) {
      log('Please add node-addon-api to your dependencies');
      return;
    }
    try {
      const gyp = require('node-gyp');
      log(`Found node-gyp ${gyp().version}`);
    } catch (err) {
      log('Please add node-gyp to your dependencies');
      return;
    }
    log('See https://sharp.pixelplumbing.com/install#building-from-source');
    const status = spawnRebuild();
    if (status !== 0) {
      process.exit(status);
    }
  };

  if (useGlobalLibvips(log)) {
    buildFromSource(`Detected globally-installed libvips v${globalLibvipsVersion()}`);
  } else if (process.env.npm_config_build_from_source) {
    buildFromSource('Detected --build-from-source flag');
  }
} catch (err) {
  const summary = err.message.split(/\n/).slice(0, 1);
  console.log(`sharp: skipping install check: ${summary}`);
}
node_modules/sharp/lib/channel.js (new file, 174 lines, generated, vendored)
@@ -0,0 +1,174 @@
// Copyright 2013 Lovell Fuller and others.
// SPDX-License-Identifier: Apache-2.0

'use strict';

const is = require('./is');

/**
 * Boolean operations for bandbool.
 * @private
 */
const bool = {
  and: 'and',
  or: 'or',
  eor: 'eor'
};

/**
 * Remove alpha channel, if any. This is a no-op if the image does not have an alpha channel.
 *
 * See also {@link /api-operation#flatten|flatten}.
 *
 * @example
 * sharp('rgba.png')
 *   .removeAlpha()
 *   .toFile('rgb.png', function(err, info) {
 *     // rgb.png is a 3 channel image without an alpha channel
 *   });
 *
 * @returns {Sharp}
 */
function removeAlpha () {
  this.options.removeAlpha = true;
  return this;
}

/**
 * Ensure the output image has an alpha transparency channel.
 * If missing, the added alpha channel will have the specified
 * transparency level, defaulting to fully-opaque (1).
 * This is a no-op if the image already has an alpha channel.
 *
 * @since 0.21.2
 *
 * @example
 * // rgba.png will be a 4 channel image with a fully-opaque alpha channel
 * await sharp('rgb.jpg')
 *   .ensureAlpha()
 *   .toFile('rgba.png')
 *
 * @example
 * // rgba is a 4 channel image with a fully-transparent alpha channel
 * const rgba = await sharp(rgb)
 *   .ensureAlpha(0)
 *   .toBuffer();
 *
 * @param {number} [alpha=1] - alpha transparency level (0=fully-transparent, 1=fully-opaque)
 * @returns {Sharp}
 * @throws {Error} Invalid alpha transparency level
 */
function ensureAlpha (alpha) {
  if (is.defined(alpha)) {
    if (is.number(alpha) && is.inRange(alpha, 0, 1)) {
      this.options.ensureAlpha = alpha;
    } else {
      throw is.invalidParameterError('alpha', 'number between 0 and 1', alpha);
    }
  } else {
    this.options.ensureAlpha = 1;
  }
  return this;
}

/**
 * Extract a single channel from a multi-channel image.
 *
 * @example
 * // green.jpg is a greyscale image containing the green channel of the input
 * await sharp(input)
 *   .extractChannel('green')
 *   .toFile('green.jpg');
 *
 * @example
 * // red1 is the red value of the first pixel, red2 the second pixel etc.
 * const [red1, red2, ...] = await sharp(input)
 *   .extractChannel(0)
 *   .raw()
 *   .toBuffer();
 *
 * @param {number|string} channel - zero-indexed channel/band number to extract, or `red`, `green`, `blue` or `alpha`.
 * @returns {Sharp}
 * @throws {Error} Invalid channel
 */
function extractChannel (channel) {
  const channelMap = { red: 0, green: 1, blue: 2, alpha: 3 };
  if (Object.keys(channelMap).includes(channel)) {
    channel = channelMap[channel];
  }
  if (is.integer(channel) && is.inRange(channel, 0, 4)) {
    this.options.extractChannel = channel;
  } else {
    throw is.invalidParameterError('channel', 'integer or one of: red, green, blue, alpha', channel);
  }
  return this;
}

/**
 * Join one or more channels to the image.
 * The meaning of the added channels depends on the output colourspace, set with `toColourspace()`.
 * By default the output image will be web-friendly sRGB, with additional channels interpreted as alpha channels.
 * Channel ordering follows vips convention:
 * - sRGB: 0: Red, 1: Green, 2: Blue, 3: Alpha.
 * - CMYK: 0: Magenta, 1: Cyan, 2: Yellow, 3: Black, 4: Alpha.
 *
 * Buffers may be any of the image formats supported by sharp.
 * For raw pixel input, the `options` object should contain a `raw` attribute, which follows the format of the attribute of the same name in the `sharp()` constructor.
 *
 * @param {Array<string|Buffer>|string|Buffer} images - one or more images (file paths, Buffers).
 * @param {Object} options - image options, see `sharp()` constructor.
 * @returns {Sharp}
 * @throws {Error} Invalid parameters
 */
function joinChannel (images, options) {
  if (Array.isArray(images)) {
    images.forEach(function (image) {
      this.options.joinChannelIn.push(this._createInputDescriptor(image, options));
    }, this);
  } else {
    this.options.joinChannelIn.push(this._createInputDescriptor(images, options));
  }
  return this;
}

/**
 * Perform a bitwise boolean operation on all input image channels (bands) to produce a single channel output image.
 *
 * @example
 * sharp('3-channel-rgb-input.png')
 *   .bandbool(sharp.bool.and)
 *   .toFile('1-channel-output.png', function (err, info) {
 *     // The output will be a single channel image where each pixel `P = R & G & B`.
 *     // If `I(1,1) = [247, 170, 14] = [0b11110111, 0b10101010, 0b00001111]`
 *     // then `O(1,1) = 0b11110111 & 0b10101010 & 0b00001111 = 0b00000010 = 2`.
 *   });
 *
 * @param {string} boolOp - one of `and`, `or` or `eor` to perform that bitwise operation, like the C logic operators `&`, `|` and `^` respectively.
 * @returns {Sharp}
 * @throws {Error} Invalid parameters
 */
function bandbool (boolOp) {
  if (is.string(boolOp) && is.inArray(boolOp, ['and', 'or', 'eor'])) {
    this.options.bandBoolOp = boolOp;
  } else {
    throw is.invalidParameterError('boolOp', 'one of: and, or, eor', boolOp);
  }
  return this;
}

/**
 * Decorate the Sharp prototype with channel-related functions.
 * @private
 */
module.exports = function (Sharp) {
  Object.assign(Sharp.prototype, {
    // Public instance functions
    removeAlpha,
    ensureAlpha,
    extractChannel,
    joinChannel,
    bandbool
  });
  // Class attributes
  Sharp.bool = bool;
};
node_modules/sharp/lib/colour.js (new file, 180 lines, generated, vendored)
@@ -0,0 +1,180 @@
// Copyright 2013 Lovell Fuller and others.
// SPDX-License-Identifier: Apache-2.0

'use strict';

const color = require('color');
const is = require('./is');

/**
 * Colourspaces.
 * @private
 */
const colourspace = {
  multiband: 'multiband',
  'b-w': 'b-w',
  bw: 'b-w',
  cmyk: 'cmyk',
  srgb: 'srgb'
};

/**
 * Tint the image using the provided colour.
 * An alpha channel may be present and will be unchanged by the operation.
 *
 * @example
 * const output = await sharp(input)
 *   .tint({ r: 255, g: 240, b: 16 })
 *   .toBuffer();
 *
 * @param {string|Object} tint - Parsed by the [color](https://www.npmjs.org/package/color) module.
 * @returns {Sharp}
 * @throws {Error} Invalid parameter
 */
function tint (tint) {
  this._setBackgroundColourOption('tint', tint);
  return this;
}

/**
 * Convert to 8-bit greyscale; 256 shades of grey.
 * This is a linear operation. If the input image is in a non-linear colour space such as sRGB, use `gamma()` with `greyscale()` for the best results.
 * By default the output image will be web-friendly sRGB and contain three (identical) color channels.
 * This may be overridden by other sharp operations such as `toColourspace('b-w')`,
 * which will produce an output image containing one color channel.
 * An alpha channel may be present, and will be unchanged by the operation.
 *
 * @example
 * const output = await sharp(input).greyscale().toBuffer();
 *
 * @param {Boolean} [greyscale=true]
 * @returns {Sharp}
 */
function greyscale (greyscale) {
  this.options.greyscale = is.bool(greyscale) ? greyscale : true;
  return this;
}

/**
 * Alternative spelling of `greyscale`.
 * @param {Boolean} [grayscale=true]
 * @returns {Sharp}
 */
function grayscale (grayscale) {
  return this.greyscale(grayscale);
}

/**
 * Set the pipeline colourspace.
 *
 * The input image will be converted to the provided colourspace at the start of the pipeline.
 * All operations will use this colourspace before converting to the output colourspace,
 * as defined by {@link #tocolourspace|toColourspace}.
 *
 * @since 0.29.0
 *
 * @example
 * // Run pipeline in 16 bits per channel RGB while converting final result to 8 bits per channel sRGB.
 * await sharp(input)
 *   .pipelineColourspace('rgb16')
 *   .toColourspace('srgb')
 *   .toFile('16bpc-pipeline-to-8bpc-output.png')
 *
 * @param {string} [colourspace] - pipeline colourspace e.g. `rgb16`, `scrgb`, `lab`, `grey16` [...](https://github.com/libvips/libvips/blob/41cff4e9d0838498487a00623462204eb10ee5b8/libvips/iofuncs/enumtypes.c#L774)
 * @returns {Sharp}
 * @throws {Error} Invalid parameters
 */
function pipelineColourspace (colourspace) {
  if (!is.string(colourspace)) {
    throw is.invalidParameterError('colourspace', 'string', colourspace);
  }
  this.options.colourspacePipeline = colourspace;
  return this;
}

/**
 * Alternative spelling of `pipelineColourspace`.
 * @param {string} [colorspace] - pipeline colorspace.
 * @returns {Sharp}
 * @throws {Error} Invalid parameters
 */
function pipelineColorspace (colorspace) {
  return this.pipelineColourspace(colorspace);
}

/**
 * Set the output colourspace.
 * By default output image will be web-friendly sRGB, with additional channels interpreted as alpha channels.
 *
 * @example
 * // Output 16 bits per pixel RGB
 * await sharp(input)
 *   .toColourspace('rgb16')
 *   .toFile('16-bpp.png')
 *
 * @param {string} [colourspace] - output colourspace e.g. `srgb`, `rgb`, `cmyk`, `lab`, `b-w` [...](https://github.com/libvips/libvips/blob/3c0bfdf74ce1dc37a6429bed47fa76f16e2cd70a/libvips/iofuncs/enumtypes.c#L777-L794)
 * @returns {Sharp}
 * @throws {Error} Invalid parameters
 */
function toColourspace (colourspace) {
  if (!is.string(colourspace)) {
    throw is.invalidParameterError('colourspace', 'string', colourspace);
  }
  this.options.colourspace = colourspace;
  return this;
}

/**
 * Alternative spelling of `toColourspace`.
 * @param {string} [colorspace] - output colorspace.
 * @returns {Sharp}
 * @throws {Error} Invalid parameters
 */
function toColorspace (colorspace) {
  return this.toColourspace(colorspace);
}

/**
 * Update a colour attribute of the this.options Object.
 * @private
 * @param {string} key
 * @param {string|Object} value
 * @throws {Error} Invalid value
 */
function _setBackgroundColourOption (key, value) {
  if (is.defined(value)) {
    if (is.object(value) || is.string(value)) {
      const colour = color(value);
      this.options[key] = [
        colour.red(),
        colour.green(),
        colour.blue(),
        Math.round(colour.alpha() * 255)
      ];
    } else {
      throw is.invalidParameterError('background', 'object or string', value);
    }
  }
}

/**
 * Decorate the Sharp prototype with colour-related functions.
 * @private
 */
module.exports = function (Sharp) {
  Object.assign(Sharp.prototype, {
    // Public
    tint,
    greyscale,
    grayscale,
    pipelineColourspace,
    pipelineColorspace,
    toColourspace,
    toColorspace,
    // Private
    _setBackgroundColourOption
  });
  // Class attributes
  Sharp.colourspace = colourspace;
  Sharp.colorspace = colourspace;
};
node_modules/sharp/lib/composite.js (new file, 210 lines, generated, vendored)
@@ -0,0 +1,210 @@
// Copyright 2013 Lovell Fuller and others.
// SPDX-License-Identifier: Apache-2.0

'use strict';

const is = require('./is');

/**
 * Blend modes.
 * @member
 * @private
 */
const blend = {
  clear: 'clear',
  source: 'source',
  over: 'over',
  in: 'in',
  out: 'out',
  atop: 'atop',
  dest: 'dest',
  'dest-over': 'dest-over',
  'dest-in': 'dest-in',
  'dest-out': 'dest-out',
  'dest-atop': 'dest-atop',
  xor: 'xor',
  add: 'add',
  saturate: 'saturate',
  multiply: 'multiply',
  screen: 'screen',
  overlay: 'overlay',
  darken: 'darken',
  lighten: 'lighten',
  'colour-dodge': 'colour-dodge',
  'color-dodge': 'colour-dodge',
  'colour-burn': 'colour-burn',
  'color-burn': 'colour-burn',
  'hard-light': 'hard-light',
  'soft-light': 'soft-light',
  difference: 'difference',
  exclusion: 'exclusion'
};

/**
 * Composite image(s) over the processed (resized, extracted etc.) image.
 *
 * The images to composite must be the same size or smaller than the processed image.
 * If both `top` and `left` options are provided, they take precedence over `gravity`.
 *
 * Any resize, rotate or extract operations in the same processing pipeline
 * will always be applied to the input image before composition.
 *
 * The `blend` option can be one of `clear`, `source`, `over`, `in`, `out`, `atop`,
 * `dest`, `dest-over`, `dest-in`, `dest-out`, `dest-atop`,
 * `xor`, `add`, `saturate`, `multiply`, `screen`, `overlay`, `darken`, `lighten`,
 * `colour-dodge`, `color-dodge`, `colour-burn`,`color-burn`,
 * `hard-light`, `soft-light`, `difference`, `exclusion`.
 *
 * More information about blend modes can be found at
 * https://www.libvips.org/API/current/libvips-conversion.html#VipsBlendMode
 * and https://www.cairographics.org/operators/
 *
 * @since 0.22.0
 *
 * @example
 * await sharp(background)
 *   .composite([
 *     { input: layer1, gravity: 'northwest' },
 *     { input: layer2, gravity: 'southeast' },
 *   ])
 *   .toFile('combined.png');
 *
 * @example
 * const output = await sharp('input.gif', { animated: true })
 *   .composite([
 *     { input: 'overlay.png', tile: true, blend: 'saturate' }
 *   ])
 *   .toBuffer();
 *
 * @example
 * sharp('input.png')
 *   .rotate(180)
 *   .resize(300)
 *   .flatten( { background: '#ff6600' } )
 *   .composite([{ input: 'overlay.png', gravity: 'southeast' }])
 *   .sharpen()
 *   .withMetadata()
 *   .webp( { quality: 90 } )
 *   .toBuffer()
 *   .then(function(outputBuffer) {
 *     // outputBuffer contains upside down, 300px wide, alpha channel flattened
 *     // onto orange background, composited with overlay.png with SE gravity,
 *     // sharpened, with metadata, 90% quality WebP image data. Phew!
 *   });
 *
 * @param {Object[]} images - Ordered list of images to composite
 * @param {Buffer|String} [images[].input] - Buffer containing image data, String containing the path to an image file, or Create object (see below)
 * @param {Object} [images[].input.create] - describes a blank overlay to be created.
 * @param {Number} [images[].input.create.width]
 * @param {Number} [images[].input.create.height]
 * @param {Number} [images[].input.create.channels] - 3-4
 * @param {String|Object} [images[].input.create.background] - parsed by the [color](https://www.npmjs.org/package/color) module to extract values for red, green, blue and alpha.
 * @param {Object} [images[].input.text] - describes a new text image to be created.
 * @param {string} [images[].input.text.text] - text to render as a UTF-8 string. It can contain Pango markup, for example `<i>Le</i>Monde`.
 * @param {string} [images[].input.text.font] - font name to render with.
 * @param {string} [images[].input.text.fontfile] - absolute filesystem path to a font file that can be used by `font`.
 * @param {number} [images[].input.text.width=0] - integral number of pixels to word-wrap at. Lines of text wider than this will be broken at word boundaries.
 * @param {number} [images[].input.text.height=0] - integral number of pixels high. When defined, `dpi` will be ignored and the text will automatically fit the pixel resolution defined by `width` and `height`. Will be ignored if `width` is not specified or set to 0.
 * @param {string} [images[].input.text.align='left'] - text alignment (`'left'`, `'centre'`, `'center'`, `'right'`).
 * @param {boolean} [images[].input.text.justify=false] - set this to true to apply justification to the text.
 * @param {number} [images[].input.text.dpi=72] - the resolution (size) at which to render the text. Does not take effect if `height` is specified.
 * @param {boolean} [images[].input.text.rgba=false] - set this to true to enable RGBA output. This is useful for colour emoji rendering, or support for Pango markup features like `<span foreground="red">Red!</span>`.
 * @param {number} [images[].input.text.spacing=0] - text line height in points. Will use the font line height if none is specified.
 * @param {String} [images[].blend='over'] - how to blend this image with the image below.
 * @param {String} [images[].gravity='centre'] - gravity at which to place the overlay.
 * @param {Number} [images[].top] - the pixel offset from the top edge.
 * @param {Number} [images[].left] - the pixel offset from the left edge.
 * @param {Boolean} [images[].tile=false] - set to true to repeat the overlay image across the entire image with the given `gravity`.
 * @param {Boolean} [images[].premultiplied=false] - set to true to avoid premultiplying the image below. Equivalent to the `--premultiplied` vips option.
 * @param {Number} [images[].density=72] - number representing the DPI for vector overlay image.
 * @param {Object} [images[].raw] - describes overlay when using raw pixel data.
 * @param {Number} [images[].raw.width]
 * @param {Number} [images[].raw.height]
 * @param {Number} [images[].raw.channels]
 * @param {boolean} [images[].animated=false] - Set to `true` to read all frames/pages of an animated image.
 * @param {string} [images[].failOn='warning'] - @see {@link /api-constructor#parameters|constructor parameters}
 * @param {number|boolean} [images[].limitInputPixels=268402689] - @see {@link /api-constructor#parameters|constructor parameters}
 * @returns {Sharp}
 * @throws {Error} Invalid parameters
 */
function composite (images) {
  if (!Array.isArray(images)) {
    throw is.invalidParameterError('images to composite', 'array', images);
  }
  this.options.composite = images.map(image => {
    if (!is.object(image)) {
      throw is.invalidParameterError('image to composite', 'object', image);
    }
    const inputOptions = this._inputOptionsFromObject(image);
    const composite = {
      input: this._createInputDescriptor(image.input, inputOptions, { allowStream: false }),
      blend: 'over',
      tile: false,
      left: 0,
      top: 0,
      hasOffset: false,
      gravity: 0,
      premultiplied: false
    };
    if (is.defined(image.blend)) {
      if (is.string(blend[image.blend])) {
        composite.blend = blend[image.blend];
      } else {
        throw is.invalidParameterError('blend', 'valid blend name', image.blend);
      }
    }
    if (is.defined(image.tile)) {
      if (is.bool(image.tile)) {
        composite.tile = image.tile;
      } else {
        throw is.invalidParameterError('tile', 'boolean', image.tile);
      }
    }
    if (is.defined(image.left)) {
      if (is.integer(image.left)) {
        composite.left = image.left;
      } else {
        throw is.invalidParameterError('left', 'integer', image.left);
      }
    }
    if (is.defined(image.top)) {
      if (is.integer(image.top)) {
        composite.top = image.top;
      } else {
        throw is.invalidParameterError('top', 'integer', image.top);
      }
    }
    if (is.defined(image.top) !== is.defined(image.left)) {
      throw new Error('Expected both left and top to be set');
    } else {
      composite.hasOffset = is.integer(image.top) && is.integer(image.left);
    }
    if (is.defined(image.gravity)) {
      if (is.integer(image.gravity) && is.inRange(image.gravity, 0, 8)) {
        composite.gravity = image.gravity;
      } else if (is.string(image.gravity) && is.integer(this.constructor.gravity[image.gravity])) {
        composite.gravity = this.constructor.gravity[image.gravity];
      } else {
        throw is.invalidParameterError('gravity', 'valid gravity', image.gravity);
      }
    }
    if (is.defined(image.premultiplied)) {
      if (is.bool(image.premultiplied)) {
        composite.premultiplied = image.premultiplied;
      } else {
        throw is.invalidParameterError('premultiplied', 'boolean', image.premultiplied);
      }
    }
    return composite;
  });
  return this;
}

/**
 * Decorate the Sharp prototype with composite-related functions.
 * @private
 */
module.exports = function (Sharp) {
  Sharp.prototype.composite = composite;
  Sharp.blend = blend;
};
node_modules/sharp/lib/constructor.js (new file, 452 lines, generated, vendored)
@@ -0,0 +1,452 @@
// Copyright 2013 Lovell Fuller and others.
// SPDX-License-Identifier: Apache-2.0

'use strict';

const util = require('node:util');
const stream = require('node:stream');
const is = require('./is');

require('./sharp');

// Use NODE_DEBUG=sharp to enable libvips warnings
const debuglog = util.debuglog('sharp');

/**
 * Constructor factory to create an instance of `sharp`, to which further methods are chained.
 *
 * JPEG, PNG, WebP, GIF, AVIF or TIFF format image data can be streamed out from this object.
 * When using Stream based output, derived attributes are available from the `info` event.
 *
 * Non-critical problems encountered during processing are emitted as `warning` events.
 *
 * Implements the [stream.Duplex](http://nodejs.org/api/stream.html#stream_class_stream_duplex) class.
 *
 * When loading more than one page/frame of an animated image,
 * these are combined as a vertically-stacked "toilet roll" image
 * where the overall height is the `pageHeight` multiplied by the number of `pages`.
 *
 * @constructs Sharp
 *
 * @emits Sharp#info
 * @emits Sharp#warning
 *
 * @example
 * sharp('input.jpg')
 *   .resize(300, 200)
 *   .toFile('output.jpg', function(err) {
 *     // output.jpg is a 300 pixels wide and 200 pixels high image
 *     // containing a scaled and cropped version of input.jpg
 *   });
 *
 * @example
 * // Read image data from remote URL,
 * // resize to 300 pixels wide,
 * // emit an 'info' event with calculated dimensions
 * // and finally write image data to writableStream
 * const { body } = fetch('https://...');
 * const readableStream = Readable.fromWeb(body);
 * const transformer = sharp()
 *   .resize(300)
 *   .on('info', ({ height }) => {
 *     console.log(`Image height is ${height}`);
 *   });
 * readableStream.pipe(transformer).pipe(writableStream);
 *
 * @example
 * // Create a blank 300x200 PNG image of semi-translucent red pixels
 * sharp({
 *   create: {
 *     width: 300,
 *     height: 200,
 *     channels: 4,
 *     background: { r: 255, g: 0, b: 0, alpha: 0.5 }
 *   }
 * })
 *   .png()
 *   .toBuffer()
 *   .then( ... );
 *
 * @example
 * // Convert an animated GIF to an animated WebP
 * await sharp('in.gif', { animated: true }).toFile('out.webp');
 *
 * @example
 * // Read a raw array of pixels and save it to a png
 * const input = Uint8Array.from([255, 255, 255, 0, 0, 0]); // or Uint8ClampedArray
 * const image = sharp(input, {
 *   // because the input does not contain its dimensions or how many channels it has
 *   // we need to specify it in the constructor options
 *   raw: {
 *     width: 2,
 *     height: 1,
 *     channels: 3
 *   }
 * });
 * await image.toFile('my-two-pixels.png');
 *
 * @example
 * // Generate RGB Gaussian noise
 * await sharp({
 *   create: {
 *     width: 300,
 *     height: 200,
 *     channels: 3,
 *     noise: {
 *       type: 'gaussian',
 *       mean: 128,
 *       sigma: 30
 *     }
 *   }
 * }).toFile('noise.png');
 *
 * @example
 * // Generate an image from text
 * await sharp({
 *   text: {
 *     text: 'Hello, world!',
 *     width: 400, // max width
 *     height: 300 // max height
 *   }
 * }).toFile('text_bw.png');
 *
 * @example
 * // Generate an rgba image from text using pango markup and font
 * await sharp({
 *   text: {
 *     text: '<span foreground="red">Red!</span><span background="cyan">blue</span>',
 *     font: 'sans',
 *     rgba: true,
 *     dpi: 300
 *   }
 * }).toFile('text_rgba.png');
 *
 * @param {(Buffer|ArrayBuffer|Uint8Array|Uint8ClampedArray|Int8Array|Uint16Array|Int16Array|Uint32Array|Int32Array|Float32Array|Float64Array|string)} [input] - if present, can be
 *  a Buffer / ArrayBuffer / Uint8Array / Uint8ClampedArray containing JPEG, PNG, WebP, AVIF, GIF, SVG or TIFF image data, or
 *  a TypedArray containing raw pixel image data, or
 *  a String containing the filesystem path to an JPEG, PNG, WebP, AVIF, GIF, SVG or TIFF image file.
 *  JPEG, PNG, WebP, AVIF, GIF, SVG, TIFF or raw pixel image data can be streamed into the object when not present.
 * @param {Object} [options] - if present, is an Object with optional attributes.
 * @param {string} [options.failOn='warning'] - When to abort processing of invalid pixel data, one of (in order of sensitivity, least to most): 'none', 'truncated', 'error', 'warning'. Higher levels imply lower levels. Invalid metadata will always abort.
 * @param {number|boolean} [options.limitInputPixels=268402689] - Do not process input images where the number of pixels
 *  (width x height) exceeds this limit. Assumes image dimensions contained in the input metadata can be trusted.
 *  An integral Number of pixels, zero or false to remove limit, true to use default limit of 268402689 (0x3FFF x 0x3FFF).
 * @param {boolean} [options.unlimited=false] - Set this to `true` to remove safety features that help prevent memory exhaustion (JPEG, PNG, SVG, HEIF).
 * @param {boolean} [options.sequentialRead=true] - Set this to `false` to use random access rather than sequential read. Some operations will do this automatically.
 * @param {number} [options.density=72] - number representing the DPI for vector images in the range 1 to 100000.
 * @param {number} [options.ignoreIcc=false] - should the embedded ICC profile, if any, be ignored.
 * @param {number} [options.pages=1] - Number of pages to extract for multi-page input (GIF, WebP, TIFF), use -1 for all pages.
 * @param {number} [options.page=0] - Page number to start extracting from for multi-page input (GIF, WebP, TIFF), zero based.
 * @param {number} [options.subifd=-1] - subIFD (Sub Image File Directory) to extract for OME-TIFF, defaults to main image.
 * @param {number} [options.level=0] - level to extract from a multi-level input (OpenSlide), zero based.
 * @param {boolean} [options.animated=false] - Set to `true` to read all frames/pages of an animated image (GIF, WebP, TIFF), equivalent of setting `pages` to `-1`.
 * @param {Object} [options.raw] - describes raw pixel input image data. See `raw()` for pixel ordering.
 * @param {number} [options.raw.width] - integral number of pixels wide.
 * @param {number} [options.raw.height] - integral number of pixels high.
 * @param {number} [options.raw.channels] - integral number of channels, between 1 and 4.
 * @param {boolean} [options.raw.premultiplied] - specifies that the raw input has already been premultiplied, set to `true`
 *  to avoid sharp premultiplying the image. (optional, default `false`)
 * @param {Object} [options.create] - describes a new image to be created.
 * @param {number} [options.create.width] - integral number of pixels wide.
 * @param {number} [options.create.height] - integral number of pixels high.
 * @param {number} [options.create.channels] - integral number of channels, either 3 (RGB) or 4 (RGBA).
 * @param {string|Object} [options.create.background] - parsed by the [color](https://www.npmjs.org/package/color) module to extract values for red, green, blue and alpha.
 * @param {Object} [options.create.noise] - describes a noise to be created.
 * @param {string} [options.create.noise.type] - type of generated noise, currently only `gaussian` is supported.
 * @param {number} [options.create.noise.mean] - mean of pixels in generated noise.
 * @param {number} [options.create.noise.sigma] - standard deviation of pixels in generated noise.
 * @param {Object} [options.text] - describes a new text image to be created.
 * @param {string} [options.text.text] - text to render as a UTF-8 string. It can contain Pango markup, for example `<i>Le</i>Monde`.
 * @param {string} [options.text.font] - font name to render with.
 * @param {string} [options.text.fontfile] - absolute filesystem path to a font file that can be used by `font`.
 * @param {number} [options.text.width=0] - Integral number of pixels to word-wrap at. Lines of text wider than this will be broken at word boundaries.
 * @param {number} [options.text.height=0] - Maximum integral number of pixels high. When defined, `dpi` will be ignored and the text will automatically fit the pixel resolution defined by `width` and `height`. Will be ignored if `width` is not specified or set to 0.
 * @param {string} [options.text.align='left'] - Alignment style for multi-line text (`'left'`, `'centre'`, `'center'`, `'right'`).
 * @param {boolean} [options.text.justify=false] - set this to true to apply justification to the text.
 * @param {number} [options.text.dpi=72] - the resolution (size) at which to render the text. Does not take effect if `height` is specified.
 * @param {boolean} [options.text.rgba=false] - set this to true to enable RGBA output. This is useful for colour emoji rendering, or support for pango markup features like `<span foreground="red">Red!</span>`.
 * @param {number} [options.text.spacing=0] - text line height in points. Will use the font line height if none is specified.
 * @param {string} [options.text.wrap='word'] - word wrapping style when width is provided, one of: 'word', 'char', 'word-char' (prefer word, fallback to char) or 'none'.
 * @returns {Sharp}
 * @throws {Error} Invalid parameters
 */
const Sharp = function (input, options) {
  if (arguments.length === 1 && !is.defined(input)) {
    throw new Error('Invalid input');
  }
  if (!(this instanceof Sharp)) {
    return new Sharp(input, options);
  }
  stream.Duplex.call(this);
  this.options = {
    // resize options
    topOffsetPre: -1,
    leftOffsetPre: -1,
    widthPre: -1,
    heightPre: -1,
    topOffsetPost: -1,
    leftOffsetPost: -1,
    widthPost: -1,
    heightPost: -1,
    width: -1,
    height: -1,
    canvas: 'crop',
    position: 0,
    resizeBackground: [0, 0, 0, 255],
    useExifOrientation: false,
    angle: 0,
    rotationAngle: 0,
    rotationBackground: [0, 0, 0, 255],
    rotateBeforePreExtract: false,
    flip: false,
    flop: false,
    extendTop: 0,
    extendBottom: 0,
    extendLeft: 0,
    extendRight: 0,
    extendBackground: [0, 0, 0, 255],
    extendWith: 'background',
    withoutEnlargement: false,
    withoutReduction: false,
    affineMatrix: [],
    affineBackground: [0, 0, 0, 255],
    affineIdx: 0,
    affineIdy: 0,
    affineOdx: 0,
    affineOdy: 0,
    affineInterpolator: this.constructor.interpolators.bilinear,
    kernel: 'lanczos3',
    fastShrinkOnLoad: true,
    // operations
    tint: [-1, 0, 0, 0],
    flatten: false,
    flattenBackground: [0, 0, 0],
    unflatten: false,
    negate: false,
    negateAlpha: true,
    medianSize: 0,
    blurSigma: 0,
    precision: 'integer',
    minAmpl: 0.2,
    sharpenSigma: 0,
    sharpenM1: 1,
    sharpenM2: 2,
    sharpenX1: 2,
    sharpenY2: 10,
    sharpenY3: 20,
    threshold: 0,
    thresholdGrayscale: true,
    trimBackground: [],
    trimThreshold: -1,
    trimLineArt: false,
    gamma: 0,
    gammaOut: 0,
    greyscale: false,
    normalise: false,
    normaliseLower: 1,
    normaliseUpper: 99,
    claheWidth: 0,
    claheHeight: 0,
    claheMaxSlope: 3,
    brightness: 1,
    saturation: 1,
    hue: 0,
    lightness: 0,
    booleanBufferIn: null,
    booleanFileIn: '',
    joinChannelIn: [],
    extractChannel: -1,
    removeAlpha: false,
    ensureAlpha: -1,
    colourspace: 'srgb',
    colourspacePipeline: 'last',
    composite: [],
    // output
    fileOut: '',
    formatOut: 'input',
    streamOut: false,
    keepMetadata: 0,
    withMetadataOrientation: -1,
    withMetadataDensity: 0,
    withIccProfile: '',
    withExif: {},
    withExifMerge: true,
    resolveWithObject: false,
    // output format
    jpegQuality: 80,
    jpegProgressive: false,
    jpegChromaSubsampling: '4:2:0',
    jpegTrellisQuantisation: false,
    jpegOvershootDeringing: false,
    jpegOptimiseScans: false,
    jpegOptimiseCoding: true,
    jpegQuantisationTable: 0,
    pngProgressive: false,
    pngCompressionLevel: 6,
    pngAdaptiveFiltering: false,
    pngPalette: false,
    pngQuality: 100,
    pngEffort: 7,
    pngBitdepth: 8,
    pngDither: 1,
    jp2Quality: 80,
    jp2TileHeight: 512,
    jp2TileWidth: 512,
    jp2Lossless: false,
    jp2ChromaSubsampling: '4:4:4',
    webpQuality: 80,
    webpAlphaQuality: 100,
    webpLossless: false,
    webpNearLossless: false,
    webpSmartSubsample: false,
    webpPreset: 'default',
    webpEffort: 4,
    webpMinSize: false,
    webpMixed: false,
    gifBitdepth: 8,
    gifEffort: 7,
    gifDither: 1,
    gifInterFrameMaxError: 0,
    gifInterPaletteMaxError: 3,
    gifReuse: true,
    gifProgressive: false,
    tiffQuality: 80,
    tiffCompression: 'jpeg',
    tiffPredictor: 'horizontal',
    tiffPyramid: false,
    tiffMiniswhite: false,
    tiffBitdepth: 8,
    tiffTile: false,
    tiffTileHeight: 256,
    tiffTileWidth: 256,
    tiffXres: 1.0,
    tiffYres: 1.0,
    tiffResolutionUnit: 'inch',
    heifQuality: 50,
    heifLossless: false,
    heifCompression: 'av1',
    heifEffort: 4,
    heifChromaSubsampling: '4:4:4',
    heifBitdepth: 8,
    jxlDistance: 1,
    jxlDecodingTier: 0,
    jxlEffort: 7,
    jxlLossless: false,
    rawDepth: 'uchar',
    tileSize: 256,
    tileOverlap: 0,
    tileContainer: 'fs',
    tileLayout: 'dz',
    tileFormat: 'last',
    tileDepth: 'last',
    tileAngle: 0,
    tileSkipBlanks: -1,
    tileBackground: [255, 255, 255, 255],
    tileCentre: false,
    tileId: 'https://example.com/iiif',
    tileBasename: '',
    timeoutSeconds: 0,
    linearA: [],
    linearB: [],
    // Function to notify of libvips warnings
    debuglog: warning => {
      this.emit('warning', warning);
      debuglog(warning);
    },
    // Function to notify of queue length changes
    queueListener: function (queueLength) {
      Sharp.queue.emit('change', queueLength);
    }
  };
  this.options.input = this._createInputDescriptor(input, options, { allowStream: true });
  return this;
};
Object.setPrototypeOf(Sharp.prototype, stream.Duplex.prototype);
Object.setPrototypeOf(Sharp, stream.Duplex);

/**
 * Take a "snapshot" of the Sharp instance, returning a new instance.
 * Cloned instances inherit the input of their parent instance.
 * This allows multiple output Streams and therefore multiple processing pipelines to share a single input Stream.
 *
 * @example
 * const pipeline = sharp().rotate();
 * pipeline.clone().resize(800, 600).pipe(firstWritableStream);
 * pipeline.clone().extract({ left: 20, top: 20, width: 100, height: 100 }).pipe(secondWritableStream);
 * readableStream.pipe(pipeline);
 * // firstWritableStream receives auto-rotated, resized readableStream
 * // secondWritableStream receives auto-rotated, extracted region of readableStream
 *
 * @example
 * // Create a pipeline that will download an image, resize it and format it to different files
 * // Using Promises to know when the pipeline is complete
 * const fs = require("fs");
 * const got = require("got");
 * const sharpStream = sharp({ failOn: 'none' });
 *
 * const promises = [];
 *
 * promises.push(
 *   sharpStream
 *     .clone()
 *     .jpeg({ quality: 100 })
 *     .toFile("originalFile.jpg")
 * );
 *
 * promises.push(
 *   sharpStream
 *     .clone()
 *     .resize({ width: 500 })
 *     .jpeg({ quality: 80 })
 *     .toFile("optimized-500.jpg")
 * );
 *
 * promises.push(
 *   sharpStream
 *     .clone()
 *     .resize({ width: 500 })
 *     .webp({ quality: 80 })
 *     .toFile("optimized-500.webp")
 * );
 *
 * // https://github.com/sindresorhus/got/blob/main/documentation/3-streams.md
 * got.stream("https://www.example.com/some-file.jpg").pipe(sharpStream);
 *
 * Promise.all(promises)
 *   .then(res => { console.log("Done!", res); })
 *   .catch(err => {
 *     console.error("Error processing files, let's clean it up", err);
 *     try {
 *       fs.unlinkSync("originalFile.jpg");
 *       fs.unlinkSync("optimized-500.jpg");
 *       fs.unlinkSync("optimized-500.webp");
 *     } catch (e) {}
 *   });
 *
 * @returns {Sharp}
 */
function clone () {
  // Clone existing options
  const clone = this.constructor.call();
  const { debuglog, queueListener, ...options } = this.options;
  clone.options = structuredClone(options);
  clone.options.debuglog = debuglog;
  clone.options.queueListener = queueListener;
  // Pass 'finish' event to clone for Stream-based input
  if (this._isStreamInput()) {
    this.on('finish', () => {
      // Clone inherits input data
      this._flattenBufferIn();
      clone.options.input.buffer = this.options.input.buffer;
      clone.emit('finish');
    });
  }
  return clone;
}
Object.assign(Sharp.prototype, { clone });

/**
 * Export constructor.
 * @private
 */
module.exports = Sharp;
1754
node_modules/sharp/lib/index.d.ts
generated
vendored
Normal file
File diff suppressed because it is too large
16
node_modules/sharp/lib/index.js
generated
vendored
Normal file
@@ -0,0 +1,16 @@
|
||||
// Copyright 2013 Lovell Fuller and others.
|
||||
// SPDX-License-Identifier: Apache-2.0
|
||||
|
||||
'use strict';
|
||||
|
||||
const Sharp = require('./constructor');
|
||||
require('./input')(Sharp);
|
||||
require('./resize')(Sharp);
|
||||
require('./composite')(Sharp);
|
||||
require('./operation')(Sharp);
|
||||
require('./colour')(Sharp);
|
||||
require('./channel')(Sharp);
|
||||
require('./output')(Sharp);
|
||||
require('./utility')(Sharp);
|
||||
|
||||
module.exports = Sharp;
|
658
node_modules/sharp/lib/input.js
generated
vendored
Normal file
@@ -0,0 +1,658 @@
|
||||
// Copyright 2013 Lovell Fuller and others.
|
||||
// SPDX-License-Identifier: Apache-2.0
|
||||
|
||||
'use strict';
|
||||
|
||||
const color = require('color');
|
||||
const is = require('./is');
|
||||
const sharp = require('./sharp');
|
||||
|
||||
/**
|
||||
* Justification alignment
|
||||
* @member
|
||||
* @private
|
||||
*/
|
||||
const align = {
|
||||
left: 'low',
|
||||
center: 'centre',
|
||||
centre: 'centre',
|
||||
right: 'high'
|
||||
};
|
||||
|
||||
/**
|
||||
* Extract input options, if any, from an object.
|
||||
* @private
|
||||
*/
|
||||
function _inputOptionsFromObject (obj) {
|
||||
const { raw, density, limitInputPixels, ignoreIcc, unlimited, sequentialRead, failOn, failOnError, animated, page, pages, subifd } = obj;
|
||||
return [raw, density, limitInputPixels, ignoreIcc, unlimited, sequentialRead, failOn, failOnError, animated, page, pages, subifd].some(is.defined)
|
||||
? { raw, density, limitInputPixels, ignoreIcc, unlimited, sequentialRead, failOn, failOnError, animated, page, pages, subifd }
|
||||
: undefined;
|
||||
}
|
||||
|
||||
/**
|
||||
* Create Object containing input and input-related options.
|
||||
* @private
|
||||
*/
|
||||
function _createInputDescriptor (input, inputOptions, containerOptions) {
|
||||
const inputDescriptor = {
|
||||
failOn: 'warning',
|
||||
limitInputPixels: Math.pow(0x3FFF, 2),
|
||||
ignoreIcc: false,
|
||||
unlimited: false,
|
||||
sequentialRead: true
|
||||
};
|
||||
if (is.string(input)) {
|
||||
// filesystem
|
||||
inputDescriptor.file = input;
|
||||
} else if (is.buffer(input)) {
|
||||
// Buffer
|
||||
if (input.length === 0) {
|
||||
throw Error('Input Buffer is empty');
|
||||
}
|
||||
inputDescriptor.buffer = input;
|
||||
} else if (is.arrayBuffer(input)) {
|
||||
if (input.byteLength === 0) {
|
||||
throw Error('Input bit Array is empty');
|
||||
}
|
||||
inputDescriptor.buffer = Buffer.from(input, 0, input.byteLength);
|
||||
} else if (is.typedArray(input)) {
|
||||
if (input.length === 0) {
|
||||
throw Error('Input Bit Array is empty');
|
||||
}
|
||||
inputDescriptor.buffer = Buffer.from(input.buffer, input.byteOffset, input.byteLength);
|
||||
} else if (is.plainObject(input) && !is.defined(inputOptions)) {
|
||||
// Plain Object descriptor, e.g. create
|
||||
inputOptions = input;
|
||||
if (_inputOptionsFromObject(inputOptions)) {
|
||||
// Stream with options
|
||||
inputDescriptor.buffer = [];
|
||||
}
|
||||
} else if (!is.defined(input) && !is.defined(inputOptions) && is.object(containerOptions) && containerOptions.allowStream) {
|
||||
// Stream without options
|
||||
inputDescriptor.buffer = [];
|
||||
} else {
|
||||
throw new Error(`Unsupported input '${input}' of type ${typeof input}${
|
||||
is.defined(inputOptions) ? ` when also providing options of type ${typeof inputOptions}` : ''
|
||||
}`);
|
||||
}
|
||||
if (is.object(inputOptions)) {
|
||||
// Deprecated: failOnError
|
||||
if (is.defined(inputOptions.failOnError)) {
|
||||
if (is.bool(inputOptions.failOnError)) {
|
||||
inputDescriptor.failOn = inputOptions.failOnError ? 'warning' : 'none';
|
||||
} else {
|
||||
throw is.invalidParameterError('failOnError', 'boolean', inputOptions.failOnError);
|
||||
}
|
||||
}
|
||||
// failOn
|
||||
if (is.defined(inputOptions.failOn)) {
|
||||
if (is.string(inputOptions.failOn) && is.inArray(inputOptions.failOn, ['none', 'truncated', 'error', 'warning'])) {
|
||||
inputDescriptor.failOn = inputOptions.failOn;
|
||||
} else {
|
||||
throw is.invalidParameterError('failOn', 'one of: none, truncated, error, warning', inputOptions.failOn);
|
||||
}
|
||||
}
|
||||
// Density
|
||||
if (is.defined(inputOptions.density)) {
|
||||
if (is.inRange(inputOptions.density, 1, 100000)) {
|
||||
inputDescriptor.density = inputOptions.density;
|
||||
} else {
|
||||
throw is.invalidParameterError('density', 'number between 1 and 100000', inputOptions.density);
|
||||
}
|
||||
}
|
||||
// Ignore embedded ICC profile
|
||||
if (is.defined(inputOptions.ignoreIcc)) {
|
||||
if (is.bool(inputOptions.ignoreIcc)) {
|
||||
inputDescriptor.ignoreIcc = inputOptions.ignoreIcc;
|
||||
} else {
|
||||
throw is.invalidParameterError('ignoreIcc', 'boolean', inputOptions.ignoreIcc);
|
||||
}
|
||||
}
|
||||
// limitInputPixels
|
||||
if (is.defined(inputOptions.limitInputPixels)) {
|
||||
if (is.bool(inputOptions.limitInputPixels)) {
|
||||
inputDescriptor.limitInputPixels = inputOptions.limitInputPixels
|
||||
? Math.pow(0x3FFF, 2)
|
||||
: 0;
|
||||
} else if (is.integer(inputOptions.limitInputPixels) && is.inRange(inputOptions.limitInputPixels, 0, Number.MAX_SAFE_INTEGER)) {
|
||||
inputDescriptor.limitInputPixels = inputOptions.limitInputPixels;
|
||||
} else {
|
||||
throw is.invalidParameterError('limitInputPixels', 'positive integer', inputOptions.limitInputPixels);
|
||||
}
|
||||
}
|
||||
// unlimited
|
||||
if (is.defined(inputOptions.unlimited)) {
|
||||
if (is.bool(inputOptions.unlimited)) {
|
||||
inputDescriptor.unlimited = inputOptions.unlimited;
|
||||
} else {
|
||||
throw is.invalidParameterError('unlimited', 'boolean', inputOptions.unlimited);
|
||||
}
|
||||
}
|
||||
// sequentialRead
|
||||
if (is.defined(inputOptions.sequentialRead)) {
|
||||
if (is.bool(inputOptions.sequentialRead)) {
|
||||
inputDescriptor.sequentialRead = inputOptions.sequentialRead;
|
||||
} else {
|
||||
throw is.invalidParameterError('sequentialRead', 'boolean', inputOptions.sequentialRead);
|
||||
}
|
||||
}
|
||||
// Raw pixel input
|
||||
if (is.defined(inputOptions.raw)) {
|
||||
if (
|
||||
is.object(inputOptions.raw) &&
|
||||
is.integer(inputOptions.raw.width) && inputOptions.raw.width > 0 &&
|
||||
is.integer(inputOptions.raw.height) && inputOptions.raw.height > 0 &&
|
||||
is.integer(inputOptions.raw.channels) && is.inRange(inputOptions.raw.channels, 1, 4)
|
||||
) {
|
||||
inputDescriptor.rawWidth = inputOptions.raw.width;
|
||||
inputDescriptor.rawHeight = inputOptions.raw.height;
|
||||
inputDescriptor.rawChannels = inputOptions.raw.channels;
|
||||
inputDescriptor.rawPremultiplied = !!inputOptions.raw.premultiplied;
|
||||
|
||||
switch (input.constructor) {
|
||||
case Uint8Array:
|
||||
case Uint8ClampedArray:
|
||||
inputDescriptor.rawDepth = 'uchar';
|
||||
break;
|
||||
case Int8Array:
|
||||
inputDescriptor.rawDepth = 'char';
|
||||
break;
|
||||
case Uint16Array:
|
||||
inputDescriptor.rawDepth = 'ushort';
|
||||
break;
|
||||
case Int16Array:
|
||||
inputDescriptor.rawDepth = 'short';
|
||||
break;
|
||||
case Uint32Array:
|
||||
inputDescriptor.rawDepth = 'uint';
|
||||
break;
|
||||
case Int32Array:
|
||||
inputDescriptor.rawDepth = 'int';
|
||||
break;
|
||||
case Float32Array:
|
||||
inputDescriptor.rawDepth = 'float';
|
||||
break;
|
||||
case Float64Array:
|
||||
inputDescriptor.rawDepth = 'double';
|
||||
break;
|
||||
default:
|
||||
inputDescriptor.rawDepth = 'uchar';
|
||||
break;
|
||||
}
|
||||
} else {
|
||||
throw new Error('Expected width, height and channels for raw pixel input');
|
||||
}
|
||||
}
|
||||
// Multi-page input (GIF, TIFF, PDF)
|
||||
if (is.defined(inputOptions.animated)) {
|
||||
if (is.bool(inputOptions.animated)) {
|
||||
inputDescriptor.pages = inputOptions.animated ? -1 : 1;
|
||||
} else {
|
||||
throw is.invalidParameterError('animated', 'boolean', inputOptions.animated);
|
||||
}
|
||||
}
|
||||
if (is.defined(inputOptions.pages)) {
|
||||
if (is.integer(inputOptions.pages) && is.inRange(inputOptions.pages, -1, 100000)) {
|
||||
inputDescriptor.pages = inputOptions.pages;
|
||||
} else {
|
||||
throw is.invalidParameterError('pages', 'integer between -1 and 100000', inputOptions.pages);
|
||||
}
|
||||
}
|
||||
if (is.defined(inputOptions.page)) {
|
||||
if (is.integer(inputOptions.page) && is.inRange(inputOptions.page, 0, 100000)) {
|
||||
inputDescriptor.page = inputOptions.page;
|
||||
} else {
|
||||
throw is.invalidParameterError('page', 'integer between 0 and 100000', inputOptions.page);
|
||||
}
|
||||
}
|
||||
// Multi-level input (OpenSlide)
|
||||
if (is.defined(inputOptions.level)) {
|
||||
if (is.integer(inputOptions.level) && is.inRange(inputOptions.level, 0, 256)) {
|
||||
inputDescriptor.level = inputOptions.level;
|
||||
} else {
|
||||
throw is.invalidParameterError('level', 'integer between 0 and 256', inputOptions.level);
|
||||
}
|
||||
}
|
||||
// Sub Image File Directory (TIFF)
|
||||
if (is.defined(inputOptions.subifd)) {
|
||||
if (is.integer(inputOptions.subifd) && is.inRange(inputOptions.subifd, -1, 100000)) {
|
||||
inputDescriptor.subifd = inputOptions.subifd;
|
||||
} else {
|
||||
throw is.invalidParameterError('subifd', 'integer between -1 and 100000', inputOptions.subifd);
|
||||
}
|
||||
}
|
||||
// Create new image
|
||||
if (is.defined(inputOptions.create)) {
|
||||
if (
|
||||
is.object(inputOptions.create) &&
|
||||
is.integer(inputOptions.create.width) && inputOptions.create.width > 0 &&
|
||||
is.integer(inputOptions.create.height) && inputOptions.create.height > 0 &&
|
||||
is.integer(inputOptions.create.channels)
|
||||
) {
|
||||
inputDescriptor.createWidth = inputOptions.create.width;
|
||||
inputDescriptor.createHeight = inputOptions.create.height;
|
||||
inputDescriptor.createChannels = inputOptions.create.channels;
|
||||
// Noise
|
||||
if (is.defined(inputOptions.create.noise)) {
|
||||
if (!is.object(inputOptions.create.noise)) {
|
||||
throw new Error('Expected noise to be an object');
|
||||
}
|
||||
if (!is.inArray(inputOptions.create.noise.type, ['gaussian'])) {
|
||||
throw new Error('Only gaussian noise is supported at the moment');
|
||||
}
|
||||
if (!is.inRange(inputOptions.create.channels, 1, 4)) {
|
||||
throw is.invalidParameterError('create.channels', 'number between 1 and 4', inputOptions.create.channels);
|
||||
}
|
||||
inputDescriptor.createNoiseType = inputOptions.create.noise.type;
|
||||
if (is.number(inputOptions.create.noise.mean) && is.inRange(inputOptions.create.noise.mean, 0, 10000)) {
|
||||
inputDescriptor.createNoiseMean = inputOptions.create.noise.mean;
|
||||
} else {
|
||||
throw is.invalidParameterError('create.noise.mean', 'number between 0 and 10000', inputOptions.create.noise.mean);
|
||||
}
|
||||
if (is.number(inputOptions.create.noise.sigma) && is.inRange(inputOptions.create.noise.sigma, 0, 10000)) {
|
||||
inputDescriptor.createNoiseSigma = inputOptions.create.noise.sigma;
|
||||
} else {
|
||||
throw is.invalidParameterError('create.noise.sigma', 'number between 0 and 10000', inputOptions.create.noise.sigma);
|
||||
}
|
||||
} else if (is.defined(inputOptions.create.background)) {
|
||||
if (!is.inRange(inputOptions.create.channels, 3, 4)) {
|
||||
throw is.invalidParameterError('create.channels', 'number between 3 and 4', inputOptions.create.channels);
|
||||
}
|
||||
const background = color(inputOptions.create.background);
|
||||
inputDescriptor.createBackground = [
|
||||
background.red(),
|
||||
background.green(),
|
||||
background.blue(),
|
||||
Math.round(background.alpha() * 255)
|
||||
];
|
||||
} else {
|
||||
throw new Error('Expected valid noise or background to create a new input image');
|
||||
}
|
||||
delete inputDescriptor.buffer;
|
||||
} else {
|
||||
throw new Error('Expected valid width, height and channels to create a new input image');
|
||||
}
|
||||
}
|
||||
// Create a new image with text
|
||||
if (is.defined(inputOptions.text)) {
|
||||
if (is.object(inputOptions.text) && is.string(inputOptions.text.text)) {
|
||||
inputDescriptor.textValue = inputOptions.text.text;
|
||||
if (is.defined(inputOptions.text.height) && is.defined(inputOptions.text.dpi)) {
|
||||
throw new Error('Expected only one of dpi or height');
|
||||
}
|
||||
if (is.defined(inputOptions.text.font)) {
|
||||
if (is.string(inputOptions.text.font)) {
|
||||
inputDescriptor.textFont = inputOptions.text.font;
|
||||
} else {
|
||||
throw is.invalidParameterError('text.font', 'string', inputOptions.text.font);
|
||||
}
|
||||
}
|
||||
if (is.defined(inputOptions.text.fontfile)) {
|
||||
if (is.string(inputOptions.text.fontfile)) {
|
||||
inputDescriptor.textFontfile = inputOptions.text.fontfile;
|
||||
} else {
|
||||
throw is.invalidParameterError('text.fontfile', 'string', inputOptions.text.fontfile);
|
||||
}
|
||||
}
|
||||
if (is.defined(inputOptions.text.width)) {
|
||||
if (is.integer(inputOptions.text.width) && inputOptions.text.width > 0) {
|
||||
inputDescriptor.textWidth = inputOptions.text.width;
|
||||
} else {
|
||||
throw is.invalidParameterError('text.width', 'positive integer', inputOptions.text.width);
|
||||
}
|
||||
}
|
||||
if (is.defined(inputOptions.text.height)) {
|
||||
if (is.integer(inputOptions.text.height) && inputOptions.text.height > 0) {
|
||||
inputDescriptor.textHeight = inputOptions.text.height;
|
||||
} else {
|
||||
throw is.invalidParameterError('text.height', 'positive integer', inputOptions.text.height);
|
||||
}
|
||||
}
|
||||
if (is.defined(inputOptions.text.align)) {
|
||||
if (is.string(inputOptions.text.align) && is.string(this.constructor.align[inputOptions.text.align])) {
|
||||
inputDescriptor.textAlign = this.constructor.align[inputOptions.text.align];
|
||||
} else {
|
||||
throw is.invalidParameterError('text.align', 'valid alignment', inputOptions.text.align);
|
||||
}
|
||||
}
|
||||
if (is.defined(inputOptions.text.justify)) {
|
||||
if (is.bool(inputOptions.text.justify)) {
|
||||
inputDescriptor.textJustify = inputOptions.text.justify;
|
||||
} else {
|
||||
throw is.invalidParameterError('text.justify', 'boolean', inputOptions.text.justify);
|
||||
}
|
||||
}
|
||||
if (is.defined(inputOptions.text.dpi)) {
|
||||
if (is.integer(inputOptions.text.dpi) && is.inRange(inputOptions.text.dpi, 1, 1000000)) {
|
||||
inputDescriptor.textDpi = inputOptions.text.dpi;
|
||||
} else {
|
||||
throw is.invalidParameterError('text.dpi', 'integer between 1 and 1000000', inputOptions.text.dpi);
|
||||
}
|
||||
}
|
||||
if (is.defined(inputOptions.text.rgba)) {
|
||||
if (is.bool(inputOptions.text.rgba)) {
|
||||
inputDescriptor.textRgba = inputOptions.text.rgba;
|
||||
} else {
|
||||
throw is.invalidParameterError('text.rgba', 'bool', inputOptions.text.rgba);
|
||||
}
|
||||
}
|
||||
if (is.defined(inputOptions.text.spacing)) {
|
||||
if (is.integer(inputOptions.text.spacing) && is.inRange(inputOptions.text.spacing, -1000000, 1000000)) {
|
||||
inputDescriptor.textSpacing = inputOptions.text.spacing;
|
||||
} else {
|
||||
throw is.invalidParameterError('text.spacing', 'integer between -1000000 and 1000000', inputOptions.text.spacing);
|
||||
}
|
||||
}
|
||||
if (is.defined(inputOptions.text.wrap)) {
|
||||
if (is.string(inputOptions.text.wrap) && is.inArray(inputOptions.text.wrap, ['word', 'char', 'word-char', 'none'])) {
|
||||
inputDescriptor.textWrap = inputOptions.text.wrap;
|
||||
} else {
|
||||
throw is.invalidParameterError('text.wrap', 'one of: word, char, word-char, none', inputOptions.text.wrap);
|
||||
}
|
||||
}
|
||||
delete inputDescriptor.buffer;
|
||||
} else {
|
||||
throw new Error('Expected a valid string to create an image with text.');
|
||||
}
|
||||
}
|
||||
} else if (is.defined(inputOptions)) {
|
||||
throw new Error('Invalid input options ' + inputOptions);
|
||||
}
|
||||
return inputDescriptor;
|
||||
}
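// Illustrative usage of the input forms validated above (a sketch; the option
// names match the checks in this function, the concrete values are examples only):
//   sharp({ create: { width: 100, height: 100, channels: 4, background: '#ff0000' } })
//   sharp({ text: { text: 'Hello', width: 200, dpi: 300, rgba: true } })
//   sharp(rawPixelBuffer, { raw: { width: 2, height: 2, channels: 3 } })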
|
||||
|
||||
/**
|
||||
* Handle incoming Buffer chunk on Writable Stream.
|
||||
* @private
|
||||
* @param {Buffer} chunk
|
||||
* @param {string} encoding - unused
|
||||
* @param {Function} callback
|
||||
*/
|
||||
function _write (chunk, encoding, callback) {
|
||||
/* istanbul ignore else */
|
||||
if (Array.isArray(this.options.input.buffer)) {
|
||||
/* istanbul ignore else */
|
||||
if (is.buffer(chunk)) {
|
||||
if (this.options.input.buffer.length === 0) {
|
||||
this.on('finish', () => {
|
||||
this.streamInFinished = true;
|
||||
});
|
||||
}
|
||||
this.options.input.buffer.push(chunk);
|
||||
callback();
|
||||
} else {
|
||||
callback(new Error('Non-Buffer data on Writable Stream'));
|
||||
}
|
||||
} else {
|
||||
callback(new Error('Unexpected data on Writable Stream'));
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Flattens the array of chunks accumulated in input.buffer.
|
||||
* @private
|
||||
*/
|
||||
function _flattenBufferIn () {
|
||||
if (this._isStreamInput()) {
|
||||
this.options.input.buffer = Buffer.concat(this.options.input.buffer);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Are we expecting Stream-based input?
|
||||
* @private
|
||||
* @returns {boolean}
|
||||
*/
|
||||
function _isStreamInput () {
|
||||
return Array.isArray(this.options.input.buffer);
|
||||
}
|
||||
|
||||
/**
|
||||
* Fast access to (uncached) image metadata without decoding any compressed pixel data.
|
||||
*
|
||||
* This is read from the header of the input image.
|
||||
* It does not take into consideration any operations to be applied to the output image,
|
||||
* such as resize or rotate.
|
||||
*
|
||||
* Dimensions in the response will respect the `page` and `pages` properties of the
|
||||
* {@link /api-constructor#parameters|constructor parameters}.
|
||||
*
|
||||
* A `Promise` is returned when `callback` is not provided.
|
||||
*
|
||||
* - `format`: Name of decoder used to decompress image data e.g. `jpeg`, `png`, `webp`, `gif`, `svg`
|
||||
* - `size`: Total size of image in bytes, for Stream and Buffer input only
|
||||
* - `width`: Number of pixels wide (EXIF orientation is not taken into consideration, see example below)
|
||||
* - `height`: Number of pixels high (EXIF orientation is not taken into consideration, see example below)
|
||||
* - `space`: Name of colour space interpretation e.g. `srgb`, `rgb`, `cmyk`, `lab`, `b-w` [...](https://www.libvips.org/API/current/VipsImage.html#VipsInterpretation)
|
||||
* - `channels`: Number of bands e.g. `3` for sRGB, `4` for CMYK
|
||||
* - `depth`: Name of pixel depth format e.g. `uchar`, `char`, `ushort`, `float` [...](https://www.libvips.org/API/current/VipsImage.html#VipsBandFormat)
|
||||
* - `density`: Number of pixels per inch (DPI), if present
|
||||
* - `chromaSubsampling`: String containing JPEG chroma subsampling, `4:2:0` or `4:4:4` for RGB, `4:2:0:4` or `4:4:4:4` for CMYK
|
||||
* - `isProgressive`: Boolean indicating whether the image is interlaced using a progressive scan
|
||||
* - `pages`: Number of pages/frames contained within the image, with support for TIFF, HEIF, PDF, animated GIF and animated WebP
|
||||
* - `pageHeight`: Number of pixels high each page in a multi-page image will be.
|
||||
* - `paletteBitDepth`: Bit depth of palette-based image (GIF, PNG).
|
||||
* - `loop`: Number of times to loop an animated image, zero refers to a continuous loop.
|
||||
* - `delay`: Delay in ms between each page in an animated image, provided as an array of integers.
|
||||
* - `pagePrimary`: Number of the primary page in a HEIF image
|
||||
* - `levels`: Details of each level in a multi-level image provided as an array of objects, requires libvips compiled with support for OpenSlide
|
||||
* - `subifds`: Number of Sub Image File Directories in an OME-TIFF image
|
||||
* - `background`: Default background colour, if present, for PNG (bKGD) and GIF images, either an RGB Object or a single greyscale value
|
||||
* - `compression`: The encoder used to compress an HEIF file, `av1` (AVIF) or `hevc` (HEIC)
|
||||
* - `resolutionUnit`: The unit of resolution (density), either `inch` or `cm`, if present
|
||||
* - `hasProfile`: Boolean indicating the presence of an embedded ICC profile
|
||||
* - `hasAlpha`: Boolean indicating the presence of an alpha transparency channel
|
||||
* - `orientation`: Number value of the EXIF Orientation header, if present
|
||||
* - `exif`: Buffer containing raw EXIF data, if present
|
||||
* - `icc`: Buffer containing raw [ICC](https://www.npmjs.com/package/icc) profile data, if present
|
||||
* - `iptc`: Buffer containing raw IPTC data, if present
|
||||
* - `xmp`: Buffer containing raw XMP data, if present
|
||||
* - `tifftagPhotoshop`: Buffer containing raw TIFFTAG_PHOTOSHOP data, if present
|
||||
* - `formatMagick`: String containing format for images loaded via *magick
|
||||
* - `comments`: Array of keyword/text pairs representing PNG text blocks, if present.
|
||||
*
|
||||
* @example
|
||||
* const metadata = await sharp(input).metadata();
|
||||
*
|
||||
* @example
|
||||
* const image = sharp(inputJpg);
|
||||
* image
|
||||
* .metadata()
|
||||
* .then(function(metadata) {
|
||||
* return image
|
||||
* .resize(Math.round(metadata.width / 2))
|
||||
* .webp()
|
||||
* .toBuffer();
|
||||
* })
|
||||
* .then(function(data) {
|
||||
* // data contains a WebP image half the width and height of the original JPEG
|
||||
* });
|
||||
*
|
||||
* @example
|
||||
* // Based on EXIF rotation metadata, get the right-side-up width and height:
|
||||
*
|
||||
* const size = getNormalSize(await sharp(input).metadata());
|
||||
*
|
||||
* function getNormalSize({ width, height, orientation }) {
|
||||
* return (orientation || 0) >= 5
|
||||
* ? { width: height, height: width }
|
||||
* : { width, height };
|
||||
* }
|
||||
*
|
||||
* @param {Function} [callback] - called with the arguments `(err, metadata)`
|
||||
* @returns {Promise<Object>|Sharp}
|
||||
*/
|
||||
function metadata (callback) {
|
||||
const stack = Error();
|
||||
if (is.fn(callback)) {
|
||||
if (this._isStreamInput()) {
|
||||
this.on('finish', () => {
|
||||
this._flattenBufferIn();
|
||||
sharp.metadata(this.options, (err, metadata) => {
|
||||
if (err) {
|
||||
callback(is.nativeError(err, stack));
|
||||
} else {
|
||||
callback(null, metadata);
|
||||
}
|
||||
});
|
||||
});
|
||||
} else {
|
||||
sharp.metadata(this.options, (err, metadata) => {
|
||||
if (err) {
|
||||
callback(is.nativeError(err, stack));
|
||||
} else {
|
||||
callback(null, metadata);
|
||||
}
|
||||
});
|
||||
}
|
||||
return this;
|
||||
} else {
|
||||
if (this._isStreamInput()) {
|
||||
return new Promise((resolve, reject) => {
|
||||
const finished = () => {
|
||||
this._flattenBufferIn();
|
||||
sharp.metadata(this.options, (err, metadata) => {
|
||||
if (err) {
|
||||
reject(is.nativeError(err, stack));
|
||||
} else {
|
||||
resolve(metadata);
|
||||
}
|
||||
});
|
||||
};
|
||||
if (this.writableFinished) {
|
||||
finished();
|
||||
} else {
|
||||
this.once('finish', finished);
|
||||
}
|
||||
});
|
||||
} else {
|
||||
return new Promise((resolve, reject) => {
|
||||
sharp.metadata(this.options, (err, metadata) => {
|
||||
if (err) {
|
||||
reject(is.nativeError(err, stack));
|
||||
} else {
|
||||
resolve(metadata);
|
||||
}
|
||||
});
|
||||
});
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Access to pixel-derived image statistics for every channel in the image.
|
||||
* A `Promise` is returned when `callback` is not provided.
|
||||
*
|
||||
* - `channels`: Array of channel statistics for each channel in the image. Each channel statistic contains
|
||||
* - `min` (minimum value in the channel)
|
||||
* - `max` (maximum value in the channel)
|
||||
* - `sum` (sum of all values in a channel)
|
||||
* - `squaresSum` (sum of squared values in a channel)
|
||||
* - `mean` (mean of the values in a channel)
|
||||
* - `stdev` (standard deviation for the values in a channel)
|
||||
* - `minX` (x-coordinate of one of the pixels where the minimum lies)
|
||||
* - `minY` (y-coordinate of one of the pixels where the minimum lies)
|
||||
* - `maxX` (x-coordinate of one of the pixels where the maximum lies)
|
||||
* - `maxY` (y-coordinate of one of the pixels where the maximum lies)
|
||||
* - `isOpaque`: Is the image fully opaque? Will be `true` if the image has no alpha channel or if every pixel is fully opaque.
|
||||
* - `entropy`: Histogram-based estimation of greyscale entropy, discarding alpha channel if any.
|
||||
* - `sharpness`: Estimation of greyscale sharpness based on the standard deviation of a Laplacian convolution, discarding alpha channel if any.
|
||||
* - `dominant`: Object containing most dominant sRGB colour based on a 4096-bin 3D histogram.
|
||||
*
|
||||
* **Note**: Statistics are derived from the original input image. Any operations performed on the image must first be
|
||||
* written to a buffer in order to run `stats` on the result (see third example).
|
||||
*
|
||||
* @example
|
||||
* const image = sharp(inputJpg);
|
||||
* image
|
||||
* .stats()
|
||||
* .then(function(stats) {
|
||||
* // stats contains the channel-wise statistics array and the isOpaque value
|
||||
* });
|
||||
*
|
||||
* @example
|
||||
* const { entropy, sharpness, dominant } = await sharp(input).stats();
|
||||
* const { r, g, b } = dominant;
|
||||
*
|
||||
* @example
|
||||
* const image = sharp(input);
|
||||
* // store intermediate result
|
||||
* const part = await image.extract(region).toBuffer();
|
||||
* // create new instance to obtain statistics of extracted region
|
||||
* const stats = await sharp(part).stats();
|
||||
*
|
||||
* @param {Function} [callback] - called with the arguments `(err, stats)`
|
||||
* @returns {Promise<Object>}
|
||||
*/
|
||||
function stats (callback) {
|
||||
const stack = Error();
|
||||
if (is.fn(callback)) {
|
||||
if (this._isStreamInput()) {
|
||||
this.on('finish', () => {
|
||||
this._flattenBufferIn();
|
||||
sharp.stats(this.options, (err, stats) => {
|
||||
if (err) {
|
||||
callback(is.nativeError(err, stack));
|
||||
} else {
|
||||
callback(null, stats);
|
||||
}
|
||||
});
|
||||
});
|
||||
} else {
|
||||
sharp.stats(this.options, (err, stats) => {
|
||||
if (err) {
|
||||
callback(is.nativeError(err, stack));
|
||||
} else {
|
||||
callback(null, stats);
|
||||
}
|
||||
});
|
||||
}
|
||||
return this;
|
||||
} else {
|
||||
if (this._isStreamInput()) {
|
||||
return new Promise((resolve, reject) => {
|
||||
this.on('finish', function () {
|
||||
this._flattenBufferIn();
|
||||
sharp.stats(this.options, (err, stats) => {
|
||||
if (err) {
|
||||
reject(is.nativeError(err, stack));
|
||||
} else {
|
||||
resolve(stats);
|
||||
}
|
||||
});
|
||||
});
|
||||
});
|
||||
} else {
|
||||
return new Promise((resolve, reject) => {
|
||||
sharp.stats(this.options, (err, stats) => {
|
||||
if (err) {
|
||||
reject(is.nativeError(err, stack));
|
||||
} else {
|
||||
resolve(stats);
|
||||
}
|
||||
});
|
||||
});
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Decorate the Sharp prototype with input-related functions.
|
||||
* @private
|
||||
*/
|
||||
module.exports = function (Sharp) {
|
||||
Object.assign(Sharp.prototype, {
|
||||
// Private
|
||||
_inputOptionsFromObject,
|
||||
_createInputDescriptor,
|
||||
_write,
|
||||
_flattenBufferIn,
|
||||
_isStreamInput,
|
||||
// Public
|
||||
metadata,
|
||||
stats
|
||||
});
|
||||
// Class attributes
|
||||
Sharp.align = align;
|
||||
};
|
169
node_modules/sharp/lib/is.js
generated
vendored
Normal file
@@ -0,0 +1,169 @@
|
||||
// Copyright 2013 Lovell Fuller and others.
|
||||
// SPDX-License-Identifier: Apache-2.0
|
||||
|
||||
'use strict';
|
||||
|
||||
/**
|
||||
* Is this value defined and not null?
|
||||
* @private
|
||||
*/
|
||||
const defined = function (val) {
|
||||
return typeof val !== 'undefined' && val !== null;
|
||||
};
|
||||
|
||||
/**
|
||||
* Is this value an object?
|
||||
* @private
|
||||
*/
|
||||
const object = function (val) {
|
||||
return typeof val === 'object';
|
||||
};
|
||||
|
||||
/**
|
||||
* Is this value a plain object?
|
||||
* @private
|
||||
*/
|
||||
const plainObject = function (val) {
|
||||
return Object.prototype.toString.call(val) === '[object Object]';
|
||||
};
|
||||
|
||||
/**
|
||||
* Is this value a function?
|
||||
* @private
|
||||
*/
|
||||
const fn = function (val) {
|
||||
return typeof val === 'function';
|
||||
};
|
||||
|
||||
/**
|
||||
* Is this value a boolean?
|
||||
* @private
|
||||
*/
|
||||
const bool = function (val) {
|
||||
return typeof val === 'boolean';
|
||||
};
|
||||
|
||||
/**
|
||||
* Is this value a Buffer object?
|
||||
* @private
|
||||
*/
|
||||
const buffer = function (val) {
|
||||
return val instanceof Buffer;
|
||||
};
|
||||
|
||||
/**
|
||||
* Is this value a typed array object, e.g. Uint8Array or Uint8ClampedArray?
|
||||
* @private
|
||||
*/
|
||||
const typedArray = function (val) {
|
||||
if (defined(val)) {
|
||||
switch (val.constructor) {
|
||||
case Uint8Array:
|
||||
case Uint8ClampedArray:
|
||||
case Int8Array:
|
||||
case Uint16Array:
|
||||
case Int16Array:
|
||||
case Uint32Array:
|
||||
case Int32Array:
|
||||
case Float32Array:
|
||||
case Float64Array:
|
||||
return true;
|
||||
}
|
||||
}
|
||||
|
||||
return false;
|
||||
};
|
||||
|
||||
/**
|
||||
* Is this value an ArrayBuffer object?
|
||||
* @private
|
||||
*/
|
||||
const arrayBuffer = function (val) {
|
||||
return val instanceof ArrayBuffer;
|
||||
};
|
||||
|
||||
/**
|
||||
* Is this value a non-empty string?
|
||||
* @private
|
||||
*/
|
||||
const string = function (val) {
|
||||
return typeof val === 'string' && val.length > 0;
|
||||
};
|
||||
|
||||
/**
|
||||
* Is this value a real number?
|
||||
* @private
|
||||
*/
|
||||
const number = function (val) {
|
||||
return typeof val === 'number' && !Number.isNaN(val);
|
||||
};
|
||||
|
||||
/**
|
||||
* Is this value an integer?
|
||||
* @private
|
||||
*/
|
||||
const integer = function (val) {
|
||||
return Number.isInteger(val);
|
||||
};
|
||||
|
||||
/**
|
||||
* Is this value within a given inclusive range?
|
||||
* @private
|
||||
*/
|
||||
const inRange = function (val, min, max) {
|
||||
return val >= min && val <= max;
|
||||
};
|
||||
|
||||
/**
|
||||
* Is this value one of the elements of an array?
|
||||
* @private
|
||||
*/
|
||||
const inArray = function (val, list) {
|
||||
return list.includes(val);
|
||||
};
|
||||
|
||||
/**
|
||||
* Create an Error with a message relating to an invalid parameter.
|
||||
*
|
||||
* @param {string} name - parameter name.
|
||||
* @param {string} expected - description of the type/value/range expected.
|
||||
* @param {*} actual - the value received.
|
||||
* @returns {Error} Containing the formatted message.
|
||||
* @private
|
||||
*/
|
||||
const invalidParameterError = function (name, expected, actual) {
|
||||
return new Error(
|
||||
`Expected ${expected} for ${name} but received ${actual} of type ${typeof actual}`
|
||||
);
|
||||
};
|
||||
|
||||
/**
|
||||
* Ensures an Error from C++ contains a JS stack.
|
||||
*
|
||||
* @param {Error} native - Error with message from C++.
|
||||
* @param {Error} context - Error with stack from JS.
|
||||
* @returns {Error} Error with message and stack.
|
||||
* @private
|
||||
*/
|
||||
const nativeError = function (native, context) {
|
||||
context.message = native.message;
|
||||
return context;
|
||||
};
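// Illustrative usage (a sketch mirroring the pattern used in lib/input.js):
//   const stack = Error();                          // capture a JS stack up front
//   sharp.metadata(options, (err, data) =>
//     err ? callback(nativeError(err, stack)) : callback(null, data));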
|
||||
|
||||
module.exports = {
|
||||
defined,
|
||||
object,
|
||||
plainObject,
|
||||
fn,
|
||||
bool,
|
||||
buffer,
|
||||
typedArray,
|
||||
arrayBuffer,
|
||||
string,
|
||||
number,
|
||||
integer,
|
||||
inRange,
|
||||
inArray,
|
||||
invalidParameterError,
|
||||
nativeError
|
||||
};
|
203
node_modules/sharp/lib/libvips.js
generated
vendored
Normal file
@@ -0,0 +1,203 @@
|
||||
// Copyright 2013 Lovell Fuller and others.
|
||||
// SPDX-License-Identifier: Apache-2.0
|
||||
|
||||
'use strict';
|
||||
|
||||
const { spawnSync } = require('node:child_process');
|
||||
const { createHash } = require('node:crypto');
|
||||
const semverCoerce = require('semver/functions/coerce');
|
||||
const semverGreaterThanOrEqualTo = require('semver/functions/gte');
|
||||
const semverSatisfies = require('semver/functions/satisfies');
|
||||
const detectLibc = require('detect-libc');
|
||||
|
||||
const { config, engines, optionalDependencies } = require('../package.json');
|
||||
|
||||
const minimumLibvipsVersionLabelled = process.env.npm_package_config_libvips || /* istanbul ignore next */
|
||||
config.libvips;
|
||||
const minimumLibvipsVersion = semverCoerce(minimumLibvipsVersionLabelled).version;
|
||||
|
||||
const prebuiltPlatforms = [
|
||||
'darwin-arm64', 'darwin-x64',
|
||||
'linux-arm', 'linux-arm64', 'linux-s390x', 'linux-x64',
|
||||
'linuxmusl-arm64', 'linuxmusl-x64',
|
||||
'win32-ia32', 'win32-x64'
|
||||
];
|
||||
|
||||
const spawnSyncOptions = {
|
||||
encoding: 'utf8',
|
||||
shell: true
|
||||
};
|
||||
|
||||
const log = (item) => {
|
||||
if (item instanceof Error) {
|
||||
console.error(`sharp: Installation error: ${item.message}`);
|
||||
} else {
|
||||
console.log(`sharp: ${item}`);
|
||||
}
|
||||
};
|
||||
|
||||
/* istanbul ignore next */
|
||||
const runtimeLibc = () => detectLibc.isNonGlibcLinuxSync() ? detectLibc.familySync() : '';
|
||||
|
||||
const runtimePlatformArch = () => `${process.platform}${runtimeLibc()}-${process.arch}`;
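// Typical results (a sketch): 'linux-x64', 'linuxmusl-arm64', 'darwin-arm64',
// matching the prebuiltPlatforms entries listed above.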
|
||||
|
||||
/* istanbul ignore next */
|
||||
const buildPlatformArch = () => {
|
||||
if (isEmscripten()) {
|
||||
return 'wasm32';
|
||||
}
|
||||
/* eslint camelcase: ["error", { allow: ["^npm_config_"] }] */
|
||||
const { npm_config_arch, npm_config_platform, npm_config_libc } = process.env;
|
||||
const libc = typeof npm_config_libc === 'string' ? npm_config_libc : runtimeLibc();
|
||||
return `${npm_config_platform || process.platform}${libc}-${npm_config_arch || process.arch}`;
|
||||
};
|
||||
|
||||
const buildSharpLibvipsIncludeDir = () => {
|
||||
try {
|
||||
return require(`@img/sharp-libvips-dev-${buildPlatformArch()}/include`);
|
||||
} catch {
|
||||
try {
|
||||
return require('@img/sharp-libvips-dev/include');
|
||||
} catch {}
|
||||
}
|
||||
/* istanbul ignore next */
|
||||
return '';
|
||||
};
|
||||
|
||||
const buildSharpLibvipsCPlusPlusDir = () => {
|
||||
try {
|
||||
return require('@img/sharp-libvips-dev/cplusplus');
|
||||
} catch {}
|
||||
/* istanbul ignore next */
|
||||
return '';
|
||||
};
|
||||
|
||||
const buildSharpLibvipsLibDir = () => {
|
||||
try {
|
||||
return require(`@img/sharp-libvips-dev-${buildPlatformArch()}/lib`);
|
||||
} catch {
|
||||
try {
|
||||
return require(`@img/sharp-libvips-${buildPlatformArch()}/lib`);
|
||||
} catch {}
|
||||
}
|
||||
/* istanbul ignore next */
|
||||
return '';
|
||||
};
|
||||
|
||||
const isUnsupportedNodeRuntime = () => {
|
||||
/* istanbul ignore next */
|
||||
if (process.release?.name === 'node' && process.versions) {
|
||||
if (!semverSatisfies(process.versions.node, engines.node)) {
|
||||
return { found: process.versions.node, expected: engines.node };
|
||||
}
|
||||
}
|
||||
};
|
||||
|
||||
/* istanbul ignore next */
|
||||
const isEmscripten = () => {
|
||||
const { CC } = process.env;
|
||||
return Boolean(CC && CC.endsWith('/emcc'));
|
||||
};
|
||||
|
||||
const isRosetta = () => {
|
||||
/* istanbul ignore next */
|
||||
if (process.platform === 'darwin' && process.arch === 'x64') {
|
||||
const translated = spawnSync('sysctl sysctl.proc_translated', spawnSyncOptions).stdout;
|
||||
return (translated || '').trim() === 'sysctl.proc_translated: 1';
|
||||
}
|
||||
return false;
|
||||
};
|
||||
|
||||
const sha512 = (s) => createHash('sha512').update(s).digest('hex');
|
||||
|
||||
const yarnLocator = () => {
|
||||
try {
|
||||
const identHash = sha512(`imgsharp-libvips-${buildPlatformArch()}`);
|
||||
const npmVersion = semverCoerce(optionalDependencies[`@img/sharp-libvips-${buildPlatformArch()}`]).version;
|
||||
return sha512(`${identHash}npm:${npmVersion}`).slice(0, 10);
|
||||
} catch {}
|
||||
return '';
|
||||
};
|
||||
|
||||
/* istanbul ignore next */
|
||||
const spawnRebuild = () =>
|
||||
spawnSync(`node-gyp rebuild --directory=src ${isEmscripten() ? '--nodedir=emscripten' : ''}`, {
|
||||
...spawnSyncOptions,
|
||||
stdio: 'inherit'
|
||||
}).status;
|
||||
|
||||
const globalLibvipsVersion = () => {
|
||||
if (process.platform !== 'win32') {
|
||||
const globalLibvipsVersion = spawnSync('pkg-config --modversion vips-cpp', {
|
||||
...spawnSyncOptions,
|
||||
env: {
|
||||
...process.env,
|
||||
PKG_CONFIG_PATH: pkgConfigPath()
|
||||
}
|
||||
}).stdout;
|
||||
/* istanbul ignore next */
|
||||
return (globalLibvipsVersion || '').trim();
|
||||
} else {
|
||||
return '';
|
||||
}
|
||||
};
|
||||
|
||||
/* istanbul ignore next */
|
||||
const pkgConfigPath = () => {
|
||||
if (process.platform !== 'win32') {
|
||||
const brewPkgConfigPath = spawnSync(
|
||||
'which brew >/dev/null 2>&1 && brew environment --plain | grep PKG_CONFIG_LIBDIR | cut -d" " -f2',
|
||||
spawnSyncOptions
|
||||
).stdout || '';
|
||||
return [
|
||||
brewPkgConfigPath.trim(),
|
||||
process.env.PKG_CONFIG_PATH,
|
||||
'/usr/local/lib/pkgconfig',
|
||||
'/usr/lib/pkgconfig',
|
||||
'/usr/local/libdata/pkgconfig',
|
||||
'/usr/libdata/pkgconfig'
|
||||
].filter(Boolean).join(':');
|
||||
} else {
|
||||
return '';
|
||||
}
|
||||
};
|
||||
|
||||
const skipSearch = (status, reason, logger) => {
|
||||
if (logger) {
|
||||
logger(`Detected ${reason}, skipping search for globally-installed libvips`);
|
||||
}
|
||||
return status;
|
||||
};
|
||||
|
||||
const useGlobalLibvips = (logger) => {
|
||||
if (Boolean(process.env.SHARP_IGNORE_GLOBAL_LIBVIPS) === true) {
|
||||
return skipSearch(false, 'SHARP_IGNORE_GLOBAL_LIBVIPS', logger);
|
||||
}
|
||||
if (Boolean(process.env.SHARP_FORCE_GLOBAL_LIBVIPS) === true) {
|
||||
return skipSearch(true, 'SHARP_FORCE_GLOBAL_LIBVIPS', logger);
|
||||
}
|
||||
/* istanbul ignore next */
|
||||
if (isRosetta()) {
|
||||
return skipSearch(false, 'Rosetta', logger);
|
||||
}
|
||||
const globalVipsVersion = globalLibvipsVersion();
|
||||
return !!globalVipsVersion && /* istanbul ignore next */
|
||||
semverGreaterThanOrEqualTo(globalVipsVersion, minimumLibvipsVersion);
|
||||
};
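// Illustrative install-time usage (a sketch; the calling code shown here is an
// assumption, not taken from this file):
//   const { useGlobalLibvips, log } = require('./libvips');
//   if (useGlobalLibvips(log)) {
//     // build against the globally-installed libvips found via pkg-config
//   } else {
//     // fall back to the prebuilt @img/sharp-libvips-* binaries
//   }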
|
||||
|
||||
module.exports = {
|
||||
minimumLibvipsVersion,
|
||||
prebuiltPlatforms,
|
||||
buildPlatformArch,
|
||||
buildSharpLibvipsIncludeDir,
|
||||
buildSharpLibvipsCPlusPlusDir,
|
||||
buildSharpLibvipsLibDir,
|
||||
isUnsupportedNodeRuntime,
|
||||
runtimePlatformArch,
|
||||
log,
|
||||
yarnLocator,
|
||||
spawnRebuild,
|
||||
globalLibvipsVersion,
|
||||
pkgConfigPath,
|
||||
useGlobalLibvips
|
||||
};
|
958
node_modules/sharp/lib/operation.js
generated
vendored
Normal file
@@ -0,0 +1,958 @@
|
||||
// Copyright 2013 Lovell Fuller and others.
|
||||
// SPDX-License-Identifier: Apache-2.0
|
||||
|
||||
'use strict';
|
||||
|
||||
const color = require('color');
|
||||
const is = require('./is');
|
||||
|
||||
/**
|
||||
* How accurate an operation should be.
|
||||
* @member
|
||||
* @private
|
||||
*/
|
||||
const vipsPrecision = {
|
||||
integer: 'integer',
|
||||
float: 'float',
|
||||
approximate: 'approximate'
|
||||
};
|
||||
|
||||
/**
|
||||
* Rotate the output image by either an explicit angle
|
||||
* or auto-orient based on the EXIF `Orientation` tag.
|
||||
*
|
||||
* If an angle is provided, it is converted to a valid positive degree rotation.
|
||||
* For example, `-450` will produce a 270 degree rotation.
|
||||
*
|
||||
* When rotating by an angle other than a multiple of 90,
|
||||
* the background colour can be provided with the `background` option.
|
||||
*
|
||||
* If no angle is provided, it is determined from the EXIF data.
|
||||
* Mirroring is supported and may involve the use of a flip operation.
|
||||
*
|
||||
* The use of `rotate` without an angle will remove the EXIF `Orientation` tag, if any.
|
||||
*
|
||||
* Only one rotation can occur per pipeline.
|
||||
* Previous calls to `rotate` in the same pipeline will be ignored.
|
||||
*
|
||||
* Multi-page images can only be rotated by 180 degrees.
|
||||
*
|
||||
* Method order is important when rotating, resizing and/or extracting regions,
|
||||
* for example `.rotate(x).extract(y)` will produce a different result to `.extract(y).rotate(x)`.
|
||||
*
|
||||
* @example
|
||||
* const pipeline = sharp()
|
||||
* .rotate()
|
||||
* .resize(null, 200)
|
||||
* .toBuffer(function (err, outputBuffer, info) {
|
||||
* // outputBuffer contains 200px high JPEG image data,
|
||||
* // auto-rotated using EXIF Orientation tag
|
||||
* // info.width and info.height contain the dimensions of the resized image
|
||||
* });
|
||||
* readableStream.pipe(pipeline);
|
||||
*
|
||||
* @example
|
||||
* const rotateThenResize = await sharp(input)
|
||||
* .rotate(90)
|
||||
* .resize({ width: 16, height: 8, fit: 'fill' })
|
||||
* .toBuffer();
|
||||
* const resizeThenRotate = await sharp(input)
|
||||
* .resize({ width: 16, height: 8, fit: 'fill' })
|
||||
* .rotate(90)
|
||||
* .toBuffer();
|
||||
*
|
||||
* @param {number} [angle=auto] angle of rotation.
|
||||
* @param {Object} [options] - if present, is an Object with optional attributes.
|
||||
* @param {string|Object} [options.background="#000000"] parsed by the [color](https://www.npmjs.org/package/color) module to extract values for red, green, blue and alpha.
|
||||
* @returns {Sharp}
|
||||
* @throws {Error} Invalid parameters
|
||||
*/
|
||||
function rotate (angle, options) {
|
||||
if (this.options.useExifOrientation || this.options.angle || this.options.rotationAngle) {
|
||||
this.options.debuglog('ignoring previous rotate options');
|
||||
}
|
||||
if (!is.defined(angle)) {
|
||||
this.options.useExifOrientation = true;
|
||||
} else if (is.integer(angle) && !(angle % 90)) {
|
||||
this.options.angle = angle;
|
||||
} else if (is.number(angle)) {
|
||||
this.options.rotationAngle = angle;
|
||||
if (is.object(options) && options.background) {
|
||||
const backgroundColour = color(options.background);
|
||||
this.options.rotationBackground = [
|
||||
backgroundColour.red(),
|
||||
backgroundColour.green(),
|
||||
backgroundColour.blue(),
|
||||
Math.round(backgroundColour.alpha() * 255)
|
||||
];
|
||||
}
|
||||
} else {
|
||||
throw is.invalidParameterError('angle', 'numeric', angle);
|
||||
}
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Mirror the image vertically (up-down) about the x-axis.
|
||||
* This always occurs before rotation, if any.
|
||||
*
|
||||
* This operation does not work correctly with multi-page images.
|
||||
*
|
||||
* @example
|
||||
* const output = await sharp(input).flip().toBuffer();
|
||||
*
|
||||
* @param {Boolean} [flip=true]
|
||||
* @returns {Sharp}
|
||||
*/
|
||||
function flip (flip) {
|
||||
this.options.flip = is.bool(flip) ? flip : true;
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Mirror the image horizontally (left-right) about the y-axis.
|
||||
* This always occurs before rotation, if any.
|
||||
*
|
||||
* @example
|
||||
* const output = await sharp(input).flop().toBuffer();
|
||||
*
|
||||
* @param {Boolean} [flop=true]
|
||||
* @returns {Sharp}
|
||||
*/
|
||||
function flop (flop) {
|
||||
this.options.flop = is.bool(flop) ? flop : true;
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Perform an affine transform on an image. This operation will always occur after resizing, extraction and rotation, if any.
|
||||
*
|
||||
* You must provide an array of length 4 or a 2x2 affine transformation matrix.
|
||||
* By default, new pixels are filled with a black background. You can provide a background color with the `background` option.
|
||||
* A particular interpolator may also be specified. Set the `interpolator` option to an attribute of the `sharp.interpolators` Object e.g. `sharp.interpolators.nohalo`.
|
||||
*
|
||||
* In the case of a 2x2 matrix, the transform is:
|
||||
* - X = `matrix[0, 0]` \* (x + `idx`) + `matrix[0, 1]` \* (y + `idy`) + `odx`
|
||||
* - Y = `matrix[1, 0]` \* (x + `idx`) + `matrix[1, 1]` \* (y + `idy`) + `ody`
|
||||
*
|
||||
* where:
|
||||
* - x and y are the coordinates in input image.
|
||||
* - X and Y are the coordinates in output image.
|
||||
* - (0,0) is the upper left corner.
|
||||
*
|
||||
* @since 0.27.0
|
||||
*
|
||||
* @example
|
||||
* const pipeline = sharp()
|
||||
* .affine([[1, 0.3], [0.1, 0.7]], {
|
||||
* background: 'white',
|
||||
* interpolator: sharp.interpolators.nohalo
|
||||
* })
|
||||
* .toBuffer((err, outputBuffer, info) => {
|
||||
* // outputBuffer contains the transformed image
|
||||
* // info.width and info.height contain the new dimensions
|
||||
* });
|
||||
*
|
||||
* inputStream
|
||||
* .pipe(pipeline);
|
||||
*
|
||||
* @param {Array<Array<number>>|Array<number>} matrix - affine transformation matrix
|
||||
* @param {Object} [options] - if present, is an Object with optional attributes.
|
||||
* @param {String|Object} [options.background="#000000"] - parsed by the [color](https://www.npmjs.org/package/color) module to extract values for red, green, blue and alpha.
|
||||
* @param {Number} [options.idx=0] - input horizontal offset
|
||||
* @param {Number} [options.idy=0] - input vertical offset
|
||||
* @param {Number} [options.odx=0] - output horizontal offset
|
||||
* @param {Number} [options.ody=0] - output vertical offset
|
||||
* @param {String} [options.interpolator=sharp.interpolators.bicubic] - interpolator
|
||||
* @returns {Sharp}
|
||||
* @throws {Error} Invalid parameters
|
||||
*/
|
||||
function affine (matrix, options) {
|
||||
const flatMatrix = [].concat(...matrix);
|
||||
if (flatMatrix.length === 4 && flatMatrix.every(is.number)) {
|
||||
this.options.affineMatrix = flatMatrix;
|
||||
} else {
|
||||
throw is.invalidParameterError('matrix', '1x4 or 2x2 array', matrix);
|
||||
}
|
||||
|
||||
if (is.defined(options)) {
|
||||
if (is.object(options)) {
|
||||
this._setBackgroundColourOption('affineBackground', options.background);
|
||||
if (is.defined(options.idx)) {
|
||||
if (is.number(options.idx)) {
|
||||
this.options.affineIdx = options.idx;
|
||||
} else {
|
||||
throw is.invalidParameterError('options.idx', 'number', options.idx);
|
||||
}
|
||||
}
|
||||
if (is.defined(options.idy)) {
|
||||
if (is.number(options.idy)) {
|
||||
this.options.affineIdy = options.idy;
|
||||
} else {
|
||||
throw is.invalidParameterError('options.idy', 'number', options.idy);
|
||||
}
|
||||
}
|
||||
if (is.defined(options.odx)) {
|
||||
if (is.number(options.odx)) {
|
||||
this.options.affineOdx = options.odx;
|
||||
} else {
|
||||
throw is.invalidParameterError('options.odx', 'number', options.odx);
|
||||
}
|
||||
}
|
||||
if (is.defined(options.ody)) {
|
||||
if (is.number(options.ody)) {
|
||||
this.options.affineOdy = options.ody;
|
||||
} else {
|
||||
throw is.invalidParameterError('options.ody', 'number', options.ody);
|
||||
}
|
||||
}
|
||||
if (is.defined(options.interpolator)) {
|
||||
if (is.inArray(options.interpolator, Object.values(this.constructor.interpolators))) {
|
||||
this.options.affineInterpolator = options.interpolator;
|
||||
} else {
|
||||
throw is.invalidParameterError('options.interpolator', 'valid interpolator name', options.interpolator);
|
||||
}
|
||||
}
|
||||
} else {
|
||||
throw is.invalidParameterError('options', 'object', options);
|
||||
}
|
||||
}
|
||||
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Sharpen the image.
|
||||
*
|
||||
* When used without parameters, performs a fast, mild sharpen of the output image.
|
||||
*
|
||||
* When a `sigma` is provided, performs a slower, more accurate sharpen of the L channel in the LAB colour space.
|
||||
* Fine-grained control over the level of sharpening in "flat" (m1) and "jagged" (m2) areas is available.
|
||||
*
|
||||
* See {@link https://www.libvips.org/API/current/libvips-convolution.html#vips-sharpen|libvips sharpen} operation.
|
||||
*
|
||||
* @example
|
||||
* const data = await sharp(input).sharpen().toBuffer();
|
||||
*
|
||||
* @example
|
||||
* const data = await sharp(input).sharpen({ sigma: 2 }).toBuffer();
|
||||
*
|
||||
* @example
|
||||
* const data = await sharp(input)
|
||||
* .sharpen({
|
||||
* sigma: 2,
|
||||
* m1: 0,
|
||||
* m2: 3,
|
||||
* x1: 3,
|
||||
* y2: 15,
|
||||
* y3: 15,
|
||||
* })
|
||||
* .toBuffer();
|
||||
*
|
||||
* @param {Object|number} [options] - if present, is an Object with attributes
|
||||
* @param {number} [options.sigma] - the sigma of the Gaussian mask, where `sigma = 1 + radius / 2`, between 0.000001 and 10
|
||||
* @param {number} [options.m1=1.0] - the level of sharpening to apply to "flat" areas, between 0 and 1000000
|
||||
* @param {number} [options.m2=2.0] - the level of sharpening to apply to "jagged" areas, between 0 and 1000000
|
||||
* @param {number} [options.x1=2.0] - threshold between "flat" and "jagged", between 0 and 1000000
|
||||
* @param {number} [options.y2=10.0] - maximum amount of brightening, between 0 and 1000000
|
||||
* @param {number} [options.y3=20.0] - maximum amount of darkening, between 0 and 1000000
|
||||
* @param {number} [flat] - (deprecated) see `options.m1`.
|
||||
* @param {number} [jagged] - (deprecated) see `options.m2`.
|
||||
* @returns {Sharp}
|
||||
* @throws {Error} Invalid parameters
|
||||
*/
|
||||
function sharpen (options, flat, jagged) {
|
||||
if (!is.defined(options)) {
|
||||
// No arguments: default to mild sharpen
|
||||
this.options.sharpenSigma = -1;
|
||||
} else if (is.bool(options)) {
|
||||
// Deprecated boolean argument: apply mild sharpen?
|
||||
this.options.sharpenSigma = options ? -1 : 0;
|
||||
} else if (is.number(options) && is.inRange(options, 0.01, 10000)) {
|
||||
// Deprecated numeric argument: specific sigma
|
||||
this.options.sharpenSigma = options;
|
||||
// Deprecated control over flat areas
|
||||
if (is.defined(flat)) {
|
||||
if (is.number(flat) && is.inRange(flat, 0, 10000)) {
|
||||
this.options.sharpenM1 = flat;
|
||||
} else {
|
||||
throw is.invalidParameterError('flat', 'number between 0 and 10000', flat);
|
||||
}
|
||||
}
|
||||
// Deprecated control over jagged areas
|
||||
if (is.defined(jagged)) {
|
||||
if (is.number(jagged) && is.inRange(jagged, 0, 10000)) {
|
||||
this.options.sharpenM2 = jagged;
|
||||
} else {
|
||||
throw is.invalidParameterError('jagged', 'number between 0 and 10000', jagged);
|
||||
}
|
||||
}
|
||||
} else if (is.plainObject(options)) {
|
||||
if (is.number(options.sigma) && is.inRange(options.sigma, 0.000001, 10)) {
|
||||
this.options.sharpenSigma = options.sigma;
|
||||
} else {
|
||||
throw is.invalidParameterError('options.sigma', 'number between 0.000001 and 10', options.sigma);
|
||||
}
|
||||
if (is.defined(options.m1)) {
|
||||
if (is.number(options.m1) && is.inRange(options.m1, 0, 1000000)) {
|
||||
this.options.sharpenM1 = options.m1;
|
||||
} else {
|
||||
throw is.invalidParameterError('options.m1', 'number between 0 and 1000000', options.m1);
|
||||
}
|
||||
}
|
||||
if (is.defined(options.m2)) {
|
||||
if (is.number(options.m2) && is.inRange(options.m2, 0, 1000000)) {
|
||||
this.options.sharpenM2 = options.m2;
|
||||
} else {
|
||||
throw is.invalidParameterError('options.m2', 'number between 0 and 1000000', options.m2);
|
||||
}
|
||||
}
|
||||
if (is.defined(options.x1)) {
|
||||
if (is.number(options.x1) && is.inRange(options.x1, 0, 1000000)) {
|
||||
this.options.sharpenX1 = options.x1;
|
||||
} else {
|
||||
throw is.invalidParameterError('options.x1', 'number between 0 and 1000000', options.x1);
|
||||
}
|
||||
}
|
||||
if (is.defined(options.y2)) {
|
||||
if (is.number(options.y2) && is.inRange(options.y2, 0, 1000000)) {
|
||||
this.options.sharpenY2 = options.y2;
|
||||
} else {
|
||||
throw is.invalidParameterError('options.y2', 'number between 0 and 1000000', options.y2);
|
||||
}
|
||||
}
|
||||
if (is.defined(options.y3)) {
|
||||
if (is.number(options.y3) && is.inRange(options.y3, 0, 1000000)) {
|
||||
this.options.sharpenY3 = options.y3;
|
||||
} else {
|
||||
throw is.invalidParameterError('options.y3', 'number between 0 and 1000000', options.y3);
|
||||
}
|
||||
}
|
||||
} else {
|
||||
throw is.invalidParameterError('sigma', 'number between 0.01 and 10000', options);
|
||||
}
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Apply median filter.
|
||||
* When used without parameters the default window is 3x3.
|
||||
*
|
||||
* @example
|
||||
* const output = await sharp(input).median().toBuffer();
|
||||
*
|
||||
* @example
|
||||
* const output = await sharp(input).median(5).toBuffer();
|
||||
*
|
||||
* @param {number} [size=3] square mask size: size x size
|
||||
* @returns {Sharp}
|
||||
* @throws {Error} Invalid parameters
|
||||
*/
|
||||
function median (size) {
|
||||
if (!is.defined(size)) {
|
||||
// No arguments: default to 3x3
|
||||
this.options.medianSize = 3;
|
||||
} else if (is.integer(size) && is.inRange(size, 1, 1000)) {
|
||||
// Numeric argument: specific sigma
|
||||
this.options.medianSize = size;
|
||||
} else {
|
||||
throw is.invalidParameterError('size', 'integer between 1 and 1000', size);
|
||||
}
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Blur the image.
|
||||
*
|
||||
* When used without parameters, performs a fast 3x3 box blur (equivalent to a box linear filter).
|
||||
*
|
||||
* When a `sigma` is provided, performs a slower, more accurate Gaussian blur.
|
||||
*
|
||||
* @example
|
||||
* const boxBlurred = await sharp(input)
|
||||
* .blur()
|
||||
* .toBuffer();
|
||||
*
|
||||
* @example
|
||||
* const gaussianBlurred = await sharp(input)
|
||||
* .blur(5)
|
||||
* .toBuffer();
|
||||
*
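* @example
* // Object form (a sketch; the values shown are illustrative):
* // precision and minAmplitude correspond to the options documented below
* const approximateBlurred = await sharp(input)
* .blur({ sigma: 5, precision: 'approximate', minAmplitude: 0.01 })
* .toBuffer();
*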
|
||||
* @param {Object|number|Boolean} [options]
|
||||
* @param {number} [options.sigma] a value between 0.3 and 1000 representing the sigma of the Gaussian mask, where `sigma = 1 + radius / 2`.
|
||||
* @param {string} [options.precision='integer'] How accurate the operation should be, one of: integer, float, approximate.
|
||||
* @param {number} [options.minAmplitude=0.2] A value between 0.001 and 1. A smaller value will generate a larger, more accurate mask.
|
||||
* @returns {Sharp}
|
||||
* @throws {Error} Invalid parameters
|
||||
*/
|
||||
function blur (options) {
|
||||
let sigma;
|
||||
if (is.number(options)) {
|
||||
sigma = options;
|
||||
} else if (is.plainObject(options)) {
|
||||
if (!is.number(options.sigma)) {
|
||||
throw is.invalidParameterError('options.sigma', 'number between 0.3 and 1000', sigma);
|
||||
}
|
||||
sigma = options.sigma;
|
||||
if ('precision' in options) {
|
||||
if (is.string(vipsPrecision[options.precision])) {
|
||||
this.options.precision = vipsPrecision[options.precision];
|
||||
} else {
|
||||
throw is.invalidParameterError('precision', 'one of: integer, float, approximate', options.precision);
|
||||
}
|
||||
}
|
||||
if ('minAmplitude' in options) {
|
||||
if (is.number(options.minAmplitude) && is.inRange(options.minAmplitude, 0.001, 1)) {
|
||||
this.options.minAmpl = options.minAmplitude;
|
||||
} else {
|
||||
throw is.invalidParameterError('minAmplitude', 'number between 0.001 and 1', options.minAmplitude);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
if (!is.defined(options)) {
|
||||
// No arguments: default to mild blur
|
||||
this.options.blurSigma = -1;
|
||||
} else if (is.bool(options)) {
|
||||
// Boolean argument: apply mild blur?
|
||||
this.options.blurSigma = options ? -1 : 0;
|
||||
} else if (is.number(sigma) && is.inRange(sigma, 0.3, 1000)) {
|
||||
// Numeric argument: specific sigma
|
||||
this.options.blurSigma = sigma;
|
||||
} else {
|
||||
throw is.invalidParameterError('sigma', 'number between 0.3 and 1000', sigma);
|
||||
}
|
||||
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Merge alpha transparency channel, if any, with a background, then remove the alpha channel.
|
||||
*
|
||||
* See also {@link /api-channel#removealpha|removeAlpha}.
|
||||
*
|
||||
* @example
|
||||
* await sharp(rgbaInput)
|
||||
* .flatten({ background: '#F0A703' })
|
||||
* .toBuffer();
|
||||
*
|
||||
* @param {Object} [options]
|
||||
* @param {string|Object} [options.background={r: 0, g: 0, b: 0}] - background colour, parsed by the [color](https://www.npmjs.org/package/color) module, defaults to black.
|
||||
* @returns {Sharp}
|
||||
*/
|
||||
function flatten (options) {
|
||||
this.options.flatten = is.bool(options) ? options : true;
|
||||
if (is.object(options)) {
|
||||
this._setBackgroundColourOption('flattenBackground', options.background);
|
||||
}
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Ensure the image has an alpha channel
|
||||
* with all white pixel values made fully transparent.
|
||||
*
|
||||
* Existing alpha channel values for non-white pixels remain unchanged.
|
||||
*
|
||||
* This feature is experimental and the API may change.
|
||||
*
|
||||
* @since 0.32.1
|
||||
*
|
||||
* @example
|
||||
* await sharp(rgbInput)
|
||||
* .unflatten()
|
||||
* .toBuffer();
|
||||
*
|
||||
* @example
|
||||
* await sharp(rgbInput)
|
||||
* .threshold(128, { grayscale: false }) // convert bright pixels to white
|
||||
* .unflatten()
|
||||
* .toBuffer();
|
||||
*/
|
||||
function unflatten () {
|
||||
this.options.unflatten = true;
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Apply a gamma correction by reducing the encoding (darken) pre-resize at a factor of `1/gamma`
|
||||
* then increasing the encoding (brighten) post-resize at a factor of `gamma`.
|
||||
* This can improve the perceived brightness of a resized image in non-linear colour spaces.
|
||||
* JPEG and WebP input images will not take advantage of the shrink-on-load performance optimisation
|
||||
* when applying a gamma correction.
|
||||
*
|
||||
* Supply a second argument to use a different output gamma value, otherwise the first value is used in both cases.
|
||||
*
|
||||
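* @example
* // Illustrative sketch, not from the original source: apply the default
* // sRGB gamma of 2.2 while resizing, assuming `input` is a file path or Buffer
* const output = await sharp(input)
*   .resize(320)
*   .gamma()
*   .toBuffer();
*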
* @param {number} [gamma=2.2] value between 1.0 and 3.0.
|
||||
* @param {number} [gammaOut] value between 1.0 and 3.0. (optional, defaults to same as `gamma`)
|
||||
* @returns {Sharp}
|
||||
* @throws {Error} Invalid parameters
|
||||
*/
|
||||
function gamma (gamma, gammaOut) {
|
||||
if (!is.defined(gamma)) {
|
||||
// Default gamma correction of 2.2 (sRGB)
|
||||
this.options.gamma = 2.2;
|
||||
} else if (is.number(gamma) && is.inRange(gamma, 1, 3)) {
|
||||
this.options.gamma = gamma;
|
||||
} else {
|
||||
throw is.invalidParameterError('gamma', 'number between 1.0 and 3.0', gamma);
|
||||
}
|
||||
if (!is.defined(gammaOut)) {
|
||||
// Default gamma correction for output is same as input
|
||||
this.options.gammaOut = this.options.gamma;
|
||||
} else if (is.number(gammaOut) && is.inRange(gammaOut, 1, 3)) {
|
||||
this.options.gammaOut = gammaOut;
|
||||
} else {
|
||||
throw is.invalidParameterError('gammaOut', 'number between 1.0 and 3.0', gammaOut);
|
||||
}
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Produce the "negative" of the image.
|
||||
*
|
||||
* @example
|
||||
* const output = await sharp(input)
|
||||
* .negate()
|
||||
* .toBuffer();
|
||||
*
|
||||
* @example
|
||||
* const output = await sharp(input)
|
||||
* .negate({ alpha: false })
|
||||
* .toBuffer();
|
||||
*
|
||||
* @param {Object} [options]
|
||||
* @param {Boolean} [options.alpha=true] Whether or not to negate any alpha channel
|
||||
* @returns {Sharp}
|
||||
*/
|
||||
function negate (options) {
|
||||
this.options.negate = is.bool(options) ? options : true;
|
||||
if (is.plainObject(options) && 'alpha' in options) {
|
||||
if (!is.bool(options.alpha)) {
|
||||
throw is.invalidParameterError('alpha', 'should be boolean value', options.alpha);
|
||||
} else {
|
||||
this.options.negateAlpha = options.alpha;
|
||||
}
|
||||
}
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Enhance output image contrast by stretching its luminance to cover a full dynamic range.
|
||||
*
|
||||
* Uses a histogram-based approach, taking a default range of 1% to 99% to reduce sensitivity to noise at the extremes.
|
||||
*
|
||||
* Luminance values below the `lower` percentile will be underexposed by clipping to zero.
|
||||
* Luminance values above the `upper` percentile will be overexposed by clipping to the max pixel value.
|
||||
*
|
||||
* @example
|
||||
* const output = await sharp(input)
|
||||
* .normalise()
|
||||
* .toBuffer();
|
||||
*
|
||||
* @example
|
||||
* const output = await sharp(input)
|
||||
* .normalise({ lower: 0, upper: 100 })
|
||||
* .toBuffer();
|
||||
*
|
||||
* @param {Object} [options]
|
||||
* @param {number} [options.lower=1] - Percentile below which luminance values will be underexposed.
|
||||
* @param {number} [options.upper=99] - Percentile above which luminance values will be overexposed.
|
||||
* @returns {Sharp}
|
||||
*/
|
||||
function normalise (options) {
|
||||
if (is.plainObject(options)) {
|
||||
if (is.defined(options.lower)) {
|
||||
if (is.number(options.lower) && is.inRange(options.lower, 0, 99)) {
|
||||
this.options.normaliseLower = options.lower;
|
||||
} else {
|
||||
throw is.invalidParameterError('lower', 'number between 0 and 99', options.lower);
|
||||
}
|
||||
}
|
||||
if (is.defined(options.upper)) {
|
||||
if (is.number(options.upper) && is.inRange(options.upper, 1, 100)) {
|
||||
this.options.normaliseUpper = options.upper;
|
||||
} else {
|
||||
throw is.invalidParameterError('upper', 'number between 1 and 100', options.upper);
|
||||
}
|
||||
}
|
||||
}
|
||||
if (this.options.normaliseLower >= this.options.normaliseUpper) {
|
||||
throw is.invalidParameterError('range', 'lower to be less than upper',
|
||||
`${this.options.normaliseLower} >= ${this.options.normaliseUpper}`);
|
||||
}
|
||||
this.options.normalise = true;
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Alternative spelling of normalise.
|
||||
*
|
||||
* @example
|
||||
* const output = await sharp(input)
|
||||
* .normalize()
|
||||
* .toBuffer();
|
||||
*
|
||||
* @param {Object} [options]
|
||||
* @param {number} [options.lower=1] - Percentile below which luminance values will be underexposed.
|
||||
* @param {number} [options.upper=99] - Percentile above which luminance values will be overexposed.
|
||||
* @returns {Sharp}
|
||||
*/
|
||||
function normalize (options) {
|
||||
return this.normalise(options);
|
||||
}
|
||||
|
||||
/**
|
||||
* Perform contrast limiting adaptive histogram equalization
|
||||
* {@link https://en.wikipedia.org/wiki/Adaptive_histogram_equalization#Contrast_Limited_AHE|CLAHE}.
|
||||
*
|
||||
* This will, in general, enhance the clarity of the image by bringing out darker details.
|
||||
*
|
||||
* @since 0.28.3
|
||||
*
|
||||
* @example
|
||||
* const output = await sharp(input)
|
||||
* .clahe({
|
||||
* width: 3,
|
||||
* height: 3,
|
||||
* })
|
||||
* .toBuffer();
|
||||
*
|
||||
* @param {Object} options
|
||||
* @param {number} options.width - Integral width of the search window, in pixels.
|
||||
* @param {number} options.height - Integral height of the search window, in pixels.
|
||||
* @param {number} [options.maxSlope=3] - Integral level of brightening, between 0 and 100, where 0 disables contrast limiting.
|
||||
* @returns {Sharp}
|
||||
* @throws {Error} Invalid parameters
|
||||
*/
|
||||
function clahe (options) {
|
||||
if (is.plainObject(options)) {
|
||||
if (is.integer(options.width) && options.width > 0) {
|
||||
this.options.claheWidth = options.width;
|
||||
} else {
|
||||
throw is.invalidParameterError('width', 'integer greater than zero', options.width);
|
||||
}
|
||||
if (is.integer(options.height) && options.height > 0) {
|
||||
this.options.claheHeight = options.height;
|
||||
} else {
|
||||
throw is.invalidParameterError('height', 'integer greater than zero', options.height);
|
||||
}
|
||||
if (is.defined(options.maxSlope)) {
|
||||
if (is.integer(options.maxSlope) && is.inRange(options.maxSlope, 0, 100)) {
|
||||
this.options.claheMaxSlope = options.maxSlope;
|
||||
} else {
|
||||
throw is.invalidParameterError('maxSlope', 'integer between 0 and 100', options.maxSlope);
|
||||
}
|
||||
}
|
||||
} else {
|
||||
throw is.invalidParameterError('options', 'plain object', options);
|
||||
}
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Convolve the image with the specified kernel.
|
||||
*
|
||||
* @example
|
||||
* sharp(input)
|
||||
* .convolve({
|
||||
* width: 3,
|
||||
* height: 3,
|
||||
* kernel: [-1, 0, 1, -2, 0, 2, -1, 0, 1]
|
||||
* })
|
||||
* .raw()
|
||||
* .toBuffer(function(err, data, info) {
|
||||
* // data contains the raw pixel data representing the convolution
|
||||
* // of the input image with the horizontal Sobel operator
|
||||
* });
|
||||
*
|
||||
* @param {Object} kernel
|
||||
* @param {number} kernel.width - width of the kernel in pixels.
|
||||
* @param {number} kernel.height - height of the kernel in pixels.
|
||||
* @param {Array<number>} kernel.kernel - Array of length `width*height` containing the kernel values.
|
||||
* @param {number} [kernel.scale=sum] - the scale of the kernel in pixels.
|
||||
* @param {number} [kernel.offset=0] - the offset of the kernel in pixels.
|
||||
* @returns {Sharp}
|
||||
* @throws {Error} Invalid parameters
|
||||
*/
|
||||
function convolve (kernel) {
|
||||
if (!is.object(kernel) || !Array.isArray(kernel.kernel) ||
|
||||
!is.integer(kernel.width) || !is.integer(kernel.height) ||
|
||||
!is.inRange(kernel.width, 3, 1001) || !is.inRange(kernel.height, 3, 1001) ||
|
||||
kernel.height * kernel.width !== kernel.kernel.length
|
||||
) {
|
||||
// must pass in a kernel
|
||||
throw new Error('Invalid convolution kernel');
|
||||
}
|
||||
// Default scale is sum of kernel values
|
||||
if (!is.integer(kernel.scale)) {
|
||||
kernel.scale = kernel.kernel.reduce(function (a, b) {
|
||||
return a + b;
|
||||
}, 0);
|
||||
}
|
||||
// Clip scale to a minimum value of 1
|
||||
if (kernel.scale < 1) {
|
||||
kernel.scale = 1;
|
||||
}
|
||||
if (!is.integer(kernel.offset)) {
|
||||
kernel.offset = 0;
|
||||
}
|
||||
this.options.convKernel = kernel;
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Any pixel value greater than or equal to the threshold value will be set to 255, otherwise it will be set to 0.
|
||||
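*
* @example
* // Illustrative sketch, not from the original source: binarise using the
* // default threshold of 128, assuming `input` is a file path or Buffer
* const output = await sharp(input)
*   .threshold()
*   .toBuffer();
*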
* @param {number} [threshold=128] - a value in the range 0-255 representing the level at which the threshold will be applied.
|
||||
* @param {Object} [options]
|
||||
* @param {Boolean} [options.greyscale=true] - convert to single channel greyscale.
|
||||
* @param {Boolean} [options.grayscale=true] - alternative spelling for greyscale.
|
||||
* @returns {Sharp}
|
||||
* @throws {Error} Invalid parameters
|
||||
*/
|
||||
function threshold (threshold, options) {
|
||||
if (!is.defined(threshold)) {
|
||||
this.options.threshold = 128;
|
||||
} else if (is.bool(threshold)) {
|
||||
this.options.threshold = threshold ? 128 : 0;
|
||||
} else if (is.integer(threshold) && is.inRange(threshold, 0, 255)) {
|
||||
this.options.threshold = threshold;
|
||||
} else {
|
||||
throw is.invalidParameterError('threshold', 'integer between 0 and 255', threshold);
|
||||
}
|
||||
if (!is.object(options) || options.greyscale === true || options.grayscale === true) {
|
||||
this.options.thresholdGrayscale = true;
|
||||
} else {
|
||||
this.options.thresholdGrayscale = false;
|
||||
}
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Perform a bitwise boolean operation with operand image.
|
||||
*
|
||||
* This operation creates an output image where each pixel is the result of
|
||||
* the selected bitwise boolean `operation` between the corresponding pixels of the input images.
|
||||
*
|
||||
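* @example
* // Illustrative sketch, not from the original source: bitwise AND of two
* // images, assuming `operandInput` is a Buffer or file path with the same
* // dimensions as `input`
* const output = await sharp(input)
*   .boolean(operandInput, 'and')
*   .toBuffer();
*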
* @param {Buffer|string} operand - Buffer containing image data or string containing the path to an image file.
|
||||
* @param {string} operator - one of `and`, `or` or `eor` to perform that bitwise operation, like the C logic operators `&`, `|` and `^` respectively.
|
||||
* @param {Object} [options]
|
||||
* @param {Object} [options.raw] - describes operand when using raw pixel data.
|
||||
* @param {number} [options.raw.width]
|
||||
* @param {number} [options.raw.height]
|
||||
* @param {number} [options.raw.channels]
|
||||
* @returns {Sharp}
|
||||
* @throws {Error} Invalid parameters
|
||||
*/
|
||||
function boolean (operand, operator, options) {
|
||||
this.options.boolean = this._createInputDescriptor(operand, options);
|
||||
if (is.string(operator) && is.inArray(operator, ['and', 'or', 'eor'])) {
|
||||
this.options.booleanOp = operator;
|
||||
} else {
|
||||
throw is.invalidParameterError('operator', 'one of: and, or, eor', operator);
|
||||
}
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Apply the linear formula `a` * input + `b` to the image to adjust image levels.
|
||||
*
|
||||
* When a single number is provided, it will be used for all image channels.
|
||||
* When an array of numbers is provided, the array length must match the number of channels.
|
||||
*
|
||||
* @example
|
||||
* await sharp(input)
|
||||
* .linear(0.5, 2)
|
||||
* .toBuffer();
|
||||
*
|
||||
* @example
|
||||
* await sharp(rgbInput)
|
||||
* .linear(
|
||||
* [0.25, 0.5, 0.75],
|
||||
* [150, 100, 50]
|
||||
* )
|
||||
* .toBuffer();
|
||||
*
|
||||
* @param {(number|number[])} [a=[]] multiplier
|
||||
* @param {(number|number[])} [b=[]] offset
|
||||
* @returns {Sharp}
|
||||
* @throws {Error} Invalid parameters
|
||||
*/
|
||||
function linear (a, b) {
|
||||
if (!is.defined(a) && is.number(b)) {
|
||||
a = 1.0;
|
||||
} else if (is.number(a) && !is.defined(b)) {
|
||||
b = 0.0;
|
||||
}
|
||||
if (!is.defined(a)) {
|
||||
this.options.linearA = [];
|
||||
} else if (is.number(a)) {
|
||||
this.options.linearA = [a];
|
||||
} else if (Array.isArray(a) && a.length && a.every(is.number)) {
|
||||
this.options.linearA = a;
|
||||
} else {
|
||||
throw is.invalidParameterError('a', 'number or array of numbers', a);
|
||||
}
|
||||
if (!is.defined(b)) {
|
||||
this.options.linearB = [];
|
||||
} else if (is.number(b)) {
|
||||
this.options.linearB = [b];
|
||||
} else if (Array.isArray(b) && b.length && b.every(is.number)) {
|
||||
this.options.linearB = b;
|
||||
} else {
|
||||
throw is.invalidParameterError('b', 'number or array of numbers', b);
|
||||
}
|
||||
if (this.options.linearA.length !== this.options.linearB.length) {
|
||||
throw new Error('Expected a and b to be arrays of the same length');
|
||||
}
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Recombine the image with the specified matrix.
|
||||
*
|
||||
* @since 0.21.1
|
||||
*
|
||||
* @example
|
||||
* sharp(input)
|
||||
* .recomb([
|
||||
* [0.3588, 0.7044, 0.1368],
|
||||
* [0.2990, 0.5870, 0.1140],
|
||||
* [0.2392, 0.4696, 0.0912],
|
||||
* ])
|
||||
* .raw()
|
||||
* .toBuffer(function(err, data, info) {
|
||||
* // data contains the raw pixel data after applying the matrix
|
||||
* // With this example input, a sepia filter has been applied
|
||||
* });
|
||||
*
|
||||
* @param {Array<Array<number>>} inputMatrix - 3x3 or 4x4 Recombination matrix
|
||||
* @returns {Sharp}
|
||||
* @throws {Error} Invalid parameters
|
||||
*/
|
||||
function recomb (inputMatrix) {
|
||||
if (!Array.isArray(inputMatrix)) {
|
||||
throw is.invalidParameterError('inputMatrix', 'array', inputMatrix);
|
||||
}
|
||||
if (inputMatrix.length !== 3 && inputMatrix.length !== 4) {
|
||||
throw is.invalidParameterError('inputMatrix', '3x3 or 4x4 array', inputMatrix.length);
|
||||
}
|
||||
const recombMatrix = inputMatrix.flat().map(Number);
|
||||
if (recombMatrix.length !== 9 && recombMatrix.length !== 16) {
|
||||
throw is.invalidParameterError('inputMatrix', 'cardinality of 9 or 16', recombMatrix.length);
|
||||
}
|
||||
this.options.recombMatrix = recombMatrix;
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Transforms the image using brightness, saturation, hue rotation, and lightness.
|
||||
* Brightness and lightness both operate on luminance, with the difference being that
|
||||
* brightness is multiplicative whereas lightness is additive.
|
||||
*
|
||||
* @since 0.22.1
|
||||
*
|
||||
* @example
|
||||
* // increase brightness by a factor of 2
|
||||
* const output = await sharp(input)
|
||||
* .modulate({
|
||||
* brightness: 2
|
||||
* })
|
||||
* .toBuffer();
|
||||
*
|
||||
* @example
|
||||
* // hue-rotate by 180 degrees
|
||||
* const output = await sharp(input)
|
||||
* .modulate({
|
||||
* hue: 180
|
||||
* })
|
||||
* .toBuffer();
|
||||
*
|
||||
* @example
|
||||
* // increase lightness by +50
|
||||
* const output = await sharp(input)
|
||||
* .modulate({
|
||||
* lightness: 50
|
||||
* })
|
||||
* .toBuffer();
|
||||
*
|
||||
* @example
|
||||
* // decrease brightness and saturation while also hue-rotating by 90 degrees
|
||||
* const output = await sharp(input)
|
||||
* .modulate({
|
||||
* brightness: 0.5,
|
||||
* saturation: 0.5,
|
||||
* hue: 90,
|
||||
* })
|
||||
* .toBuffer();
|
||||
*
|
||||
* @param {Object} [options]
|
||||
* @param {number} [options.brightness] Brightness multiplier
|
||||
* @param {number} [options.saturation] Saturation multiplier
|
||||
* @param {number} [options.hue] Degrees for hue rotation
|
||||
* @param {number} [options.lightness] Lightness addend
|
||||
* @returns {Sharp}
|
||||
*/
|
||||
function modulate (options) {
|
||||
if (!is.plainObject(options)) {
|
||||
throw is.invalidParameterError('options', 'plain object', options);
|
||||
}
|
||||
if ('brightness' in options) {
|
||||
if (is.number(options.brightness) && options.brightness >= 0) {
|
||||
this.options.brightness = options.brightness;
|
||||
} else {
|
||||
throw is.invalidParameterError('brightness', 'number above zero', options.brightness);
|
||||
}
|
||||
}
|
||||
if ('saturation' in options) {
|
||||
if (is.number(options.saturation) && options.saturation >= 0) {
|
||||
this.options.saturation = options.saturation;
|
||||
} else {
|
||||
throw is.invalidParameterError('saturation', 'number above zero', options.saturation);
|
||||
}
|
||||
}
|
||||
if ('hue' in options) {
|
||||
if (is.integer(options.hue)) {
|
||||
this.options.hue = options.hue % 360;
|
||||
} else {
|
||||
throw is.invalidParameterError('hue', 'integer', options.hue);
|
||||
}
|
||||
}
|
||||
if ('lightness' in options) {
|
||||
if (is.number(options.lightness)) {
|
||||
this.options.lightness = options.lightness;
|
||||
} else {
|
||||
throw is.invalidParameterError('lightness', 'number', options.lightness);
|
||||
}
|
||||
}
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Decorate the Sharp prototype with operation-related functions.
|
||||
* @private
|
||||
*/
|
||||
module.exports = function (Sharp) {
|
||||
Object.assign(Sharp.prototype, {
|
||||
rotate,
|
||||
flip,
|
||||
flop,
|
||||
affine,
|
||||
sharpen,
|
||||
median,
|
||||
blur,
|
||||
flatten,
|
||||
unflatten,
|
||||
gamma,
|
||||
negate,
|
||||
normalise,
|
||||
normalize,
|
||||
clahe,
|
||||
convolve,
|
||||
threshold,
|
||||
boolean,
|
||||
linear,
|
||||
recomb,
|
||||
modulate
|
||||
});
|
||||
};
|
1587
node_modules/sharp/lib/output.js
generated
vendored
Normal file
File diff suppressed because it is too large
587
node_modules/sharp/lib/resize.js
generated
vendored
Normal file
@@ -0,0 +1,587 @@
|
||||
// Copyright 2013 Lovell Fuller and others.
|
||||
// SPDX-License-Identifier: Apache-2.0
|
||||
|
||||
'use strict';
|
||||
|
||||
const is = require('./is');
|
||||
|
||||
/**
|
||||
* Weighting to apply when using contain/cover fit.
|
||||
* @member
|
||||
* @private
|
||||
*/
|
||||
const gravity = {
|
||||
center: 0,
|
||||
centre: 0,
|
||||
north: 1,
|
||||
east: 2,
|
||||
south: 3,
|
||||
west: 4,
|
||||
northeast: 5,
|
||||
southeast: 6,
|
||||
southwest: 7,
|
||||
northwest: 8
|
||||
};
|
||||
|
||||
/**
|
||||
* Position to apply when using contain/cover fit.
|
||||
* @member
|
||||
* @private
|
||||
*/
|
||||
const position = {
|
||||
top: 1,
|
||||
right: 2,
|
||||
bottom: 3,
|
||||
left: 4,
|
||||
'right top': 5,
|
||||
'right bottom': 6,
|
||||
'left bottom': 7,
|
||||
'left top': 8
|
||||
};
|
||||
|
||||
/**
|
||||
* How to extend the image.
|
||||
* @member
|
||||
* @private
|
||||
*/
|
||||
const extendWith = {
|
||||
background: 'background',
|
||||
copy: 'copy',
|
||||
repeat: 'repeat',
|
||||
mirror: 'mirror'
|
||||
};
|
||||
|
||||
/**
|
||||
* Strategies for automagic cover behaviour.
|
||||
* @member
|
||||
* @private
|
||||
*/
|
||||
const strategy = {
|
||||
entropy: 16,
|
||||
attention: 17
|
||||
};
|
||||
|
||||
/**
|
||||
* Reduction kernels.
|
||||
* @member
|
||||
* @private
|
||||
*/
|
||||
const kernel = {
|
||||
nearest: 'nearest',
|
||||
linear: 'linear',
|
||||
cubic: 'cubic',
|
||||
mitchell: 'mitchell',
|
||||
lanczos2: 'lanczos2',
|
||||
lanczos3: 'lanczos3'
|
||||
};
|
||||
|
||||
/**
|
||||
* Methods by which an image can be resized to fit the provided dimensions.
|
||||
* @member
|
||||
* @private
|
||||
*/
|
||||
const fit = {
|
||||
contain: 'contain',
|
||||
cover: 'cover',
|
||||
fill: 'fill',
|
||||
inside: 'inside',
|
||||
outside: 'outside'
|
||||
};
|
||||
|
||||
/**
|
||||
* Map external fit property to internal canvas property.
|
||||
* @member
|
||||
* @private
|
||||
*/
|
||||
const mapFitToCanvas = {
|
||||
contain: 'embed',
|
||||
cover: 'crop',
|
||||
fill: 'ignore_aspect',
|
||||
inside: 'max',
|
||||
outside: 'min'
|
||||
};
|
||||
|
||||
/**
|
||||
* @private
|
||||
*/
|
||||
function isRotationExpected (options) {
|
||||
return (options.angle % 360) !== 0 || options.useExifOrientation === true || options.rotationAngle !== 0;
|
||||
}
|
||||
|
||||
/**
|
||||
* @private
|
||||
*/
|
||||
function isResizeExpected (options) {
|
||||
return options.width !== -1 || options.height !== -1;
|
||||
}
|
||||
|
||||
/**
|
||||
* Resize image to `width`, `height` or `width x height`.
|
||||
*
|
||||
* When both a `width` and `height` are provided, the possible methods by which the image should **fit** these are:
|
||||
* - `cover`: (default) Preserving aspect ratio, attempt to ensure the image covers both provided dimensions by cropping/clipping to fit.
|
||||
* - `contain`: Preserving aspect ratio, contain within both provided dimensions using "letterboxing" where necessary.
|
||||
* - `fill`: Ignore the aspect ratio of the input and stretch to both provided dimensions.
|
||||
* - `inside`: Preserving aspect ratio, resize the image to be as large as possible while ensuring its dimensions are less than or equal to both those specified.
|
||||
* - `outside`: Preserving aspect ratio, resize the image to be as small as possible while ensuring its dimensions are greater than or equal to both those specified.
|
||||
*
|
||||
* Some of these values are based on the [object-fit](https://developer.mozilla.org/en-US/docs/Web/CSS/object-fit) CSS property.
|
||||
*
|
||||
* <img alt="Examples of various values for the fit property when resizing" width="100%" style="aspect-ratio: 998/243" src="https://cdn.jsdelivr.net/gh/lovell/sharp@main/docs/image/api-resize-fit.svg">
|
||||
*
|
||||
* When using a **fit** of `cover` or `contain`, the default **position** is `centre`. Other options are:
|
||||
* - `sharp.position`: `top`, `right top`, `right`, `right bottom`, `bottom`, `left bottom`, `left`, `left top`.
|
||||
* - `sharp.gravity`: `north`, `northeast`, `east`, `southeast`, `south`, `southwest`, `west`, `northwest`, `center` or `centre`.
|
||||
* - `sharp.strategy`: `cover` only, dynamically crop using either the `entropy` or `attention` strategy.
|
||||
*
|
||||
* Some of these values are based on the [object-position](https://developer.mozilla.org/en-US/docs/Web/CSS/object-position) CSS property.
|
||||
*
|
||||
* The strategy-based approach initially resizes so one dimension is at its target length
|
||||
* then repeatedly ranks edge regions, discarding the edge with the lowest score based on the selected strategy.
|
||||
* - `entropy`: focus on the region with the highest [Shannon entropy](https://en.wikipedia.org/wiki/Entropy_%28information_theory%29).
|
||||
* - `attention`: focus on the region with the highest luminance frequency, colour saturation and presence of skin tones.
|
||||
*
|
||||
* Possible downsizing kernels are:
|
||||
* - `nearest`: Use [nearest neighbour interpolation](http://en.wikipedia.org/wiki/Nearest-neighbor_interpolation).
|
||||
* - `linear`: Use a [triangle filter](https://en.wikipedia.org/wiki/Triangular_function).
|
||||
* - `cubic`: Use a [Catmull-Rom spline](https://en.wikipedia.org/wiki/Centripetal_Catmull%E2%80%93Rom_spline).
|
||||
* - `mitchell`: Use a [Mitchell-Netravali spline](https://www.cs.utexas.edu/~fussell/courses/cs384g-fall2013/lectures/mitchell/Mitchell.pdf).
|
||||
* - `lanczos2`: Use a [Lanczos kernel](https://en.wikipedia.org/wiki/Lanczos_resampling#Lanczos_kernel) with `a=2`.
|
||||
* - `lanczos3`: Use a Lanczos kernel with `a=3` (the default).
|
||||
*
|
||||
* When upsampling, these kernels map to `nearest`, `linear` and `cubic` interpolators.
|
||||
* Downsampling kernels without a matching upsampling interpolator map to `cubic`.
|
||||
*
|
||||
* Only one resize can occur per pipeline.
|
||||
* Previous calls to `resize` in the same pipeline will be ignored.
|
||||
*
|
||||
* @example
|
||||
* sharp(input)
|
||||
* .resize({ width: 100 })
|
||||
* .toBuffer()
|
||||
* .then(data => {
|
||||
* // 100 pixels wide, auto-scaled height
|
||||
* });
|
||||
*
|
||||
* @example
|
||||
* sharp(input)
|
||||
* .resize({ height: 100 })
|
||||
* .toBuffer()
|
||||
* .then(data => {
|
||||
* // 100 pixels high, auto-scaled width
|
||||
* });
|
||||
*
|
||||
* @example
|
||||
* sharp(input)
|
||||
* .resize(200, 300, {
|
||||
* kernel: sharp.kernel.nearest,
|
||||
* fit: 'contain',
|
||||
* position: 'right top',
|
||||
* background: { r: 255, g: 255, b: 255, alpha: 0.5 }
|
||||
* })
|
||||
* .toFile('output.png')
|
||||
* .then(() => {
|
||||
* // output.png is a 200 pixels wide and 300 pixels high image
|
||||
* // containing a nearest-neighbour scaled version
|
||||
* // contained within the north-east corner of a semi-transparent white canvas
|
||||
* });
|
||||
*
|
||||
* @example
|
||||
* const transformer = sharp()
|
||||
* .resize({
|
||||
* width: 200,
|
||||
* height: 200,
|
||||
* fit: sharp.fit.cover,
|
||||
* position: sharp.strategy.entropy
|
||||
* });
|
||||
* // Read image data from readableStream
|
||||
* // Write 200px square auto-cropped image data to writableStream
|
||||
* readableStream
|
||||
* .pipe(transformer)
|
||||
* .pipe(writableStream);
|
||||
*
|
||||
* @example
|
||||
* sharp(input)
|
||||
* .resize(200, 200, {
|
||||
* fit: sharp.fit.inside,
|
||||
* withoutEnlargement: true
|
||||
* })
|
||||
* .toFormat('jpeg')
|
||||
* .toBuffer()
|
||||
* .then(function(outputBuffer) {
|
||||
* // outputBuffer contains JPEG image data
|
||||
* // no wider and no higher than 200 pixels
|
||||
* // and no larger than the input image
|
||||
* });
|
||||
*
|
||||
* @example
|
||||
* sharp(input)
|
||||
* .resize(200, 200, {
|
||||
* fit: sharp.fit.outside,
|
||||
* withoutReduction: true
|
||||
* })
|
||||
* .toFormat('jpeg')
|
||||
* .toBuffer()
|
||||
* .then(function(outputBuffer) {
|
||||
* // outputBuffer contains JPEG image data
|
||||
* // of at least 200 pixels wide and 200 pixels high while maintaining aspect ratio
|
||||
* // and no smaller than the input image
|
||||
* });
|
||||
*
|
||||
* @example
|
||||
* const scaleByHalf = await sharp(input)
|
||||
* .metadata()
|
||||
* .then(({ width }) => sharp(input)
|
||||
* .resize(Math.round(width * 0.5))
|
||||
* .toBuffer()
|
||||
* );
|
||||
*
|
||||
* @param {number} [width] - How many pixels wide the resultant image should be. Use `null` or `undefined` to auto-scale the width to match the height.
|
||||
* @param {number} [height] - How many pixels high the resultant image should be. Use `null` or `undefined` to auto-scale the height to match the width.
|
||||
* @param {Object} [options]
|
||||
* @param {number} [options.width] - An alternative means of specifying `width`. If both are present this takes priority.
|
||||
* @param {number} [options.height] - An alternative means of specifying `height`. If both are present this takes priority.
|
||||
* @param {String} [options.fit='cover'] - How the image should be resized/cropped to fit the target dimension(s), one of `cover`, `contain`, `fill`, `inside` or `outside`.
|
||||
* @param {String} [options.position='centre'] - A position, gravity or strategy to use when `fit` is `cover` or `contain`.
|
||||
* @param {String|Object} [options.background={r: 0, g: 0, b: 0, alpha: 1}] - background colour when `fit` is `contain`, parsed by the [color](https://www.npmjs.org/package/color) module, defaults to black without transparency.
|
||||
* @param {String} [options.kernel='lanczos3'] - The kernel to use for image reduction and the inferred interpolator to use for upsampling. Use the `fastShrinkOnLoad` option to control kernel vs shrink-on-load.
|
||||
* @param {Boolean} [options.withoutEnlargement=false] - Do not scale up if the width *or* height are already less than the target dimensions, equivalent to GraphicsMagick's `>` geometry option. This may result in output dimensions smaller than the target dimensions.
|
||||
* @param {Boolean} [options.withoutReduction=false] - Do not scale down if the width *or* height are already greater than the target dimensions, equivalent to GraphicsMagick's `<` geometry option. This may still result in a crop to reach the target dimensions.
|
||||
* @param {Boolean} [options.fastShrinkOnLoad=true] - Take greater advantage of the JPEG and WebP shrink-on-load feature, which can lead to a slight moiré pattern or round-down of an auto-scaled dimension.
|
||||
* @returns {Sharp}
|
||||
* @throws {Error} Invalid parameters
|
||||
*/
|
||||
function resize (widthOrOptions, height, options) {
|
||||
if (isResizeExpected(this.options)) {
|
||||
this.options.debuglog('ignoring previous resize options');
|
||||
}
|
||||
if (this.options.widthPost !== -1) {
|
||||
this.options.debuglog('operation order will be: extract, resize, extract');
|
||||
}
|
||||
if (is.defined(widthOrOptions)) {
|
||||
if (is.object(widthOrOptions) && !is.defined(options)) {
|
||||
options = widthOrOptions;
|
||||
} else if (is.integer(widthOrOptions) && widthOrOptions > 0) {
|
||||
this.options.width = widthOrOptions;
|
||||
} else {
|
||||
throw is.invalidParameterError('width', 'positive integer', widthOrOptions);
|
||||
}
|
||||
} else {
|
||||
this.options.width = -1;
|
||||
}
|
||||
if (is.defined(height)) {
|
||||
if (is.integer(height) && height > 0) {
|
||||
this.options.height = height;
|
||||
} else {
|
||||
throw is.invalidParameterError('height', 'positive integer', height);
|
||||
}
|
||||
} else {
|
||||
this.options.height = -1;
|
||||
}
|
||||
if (is.object(options)) {
|
||||
// Width
|
||||
if (is.defined(options.width)) {
|
||||
if (is.integer(options.width) && options.width > 0) {
|
||||
this.options.width = options.width;
|
||||
} else {
|
||||
throw is.invalidParameterError('width', 'positive integer', options.width);
|
||||
}
|
||||
}
|
||||
// Height
|
||||
if (is.defined(options.height)) {
|
||||
if (is.integer(options.height) && options.height > 0) {
|
||||
this.options.height = options.height;
|
||||
} else {
|
||||
throw is.invalidParameterError('height', 'positive integer', options.height);
|
||||
}
|
||||
}
|
||||
// Fit
|
||||
if (is.defined(options.fit)) {
|
||||
const canvas = mapFitToCanvas[options.fit];
|
||||
if (is.string(canvas)) {
|
||||
this.options.canvas = canvas;
|
||||
} else {
|
||||
throw is.invalidParameterError('fit', 'valid fit', options.fit);
|
||||
}
|
||||
}
|
||||
// Position
|
||||
if (is.defined(options.position)) {
|
||||
const pos = is.integer(options.position)
|
||||
? options.position
|
||||
: strategy[options.position] || position[options.position] || gravity[options.position];
|
||||
if (is.integer(pos) && (is.inRange(pos, 0, 8) || is.inRange(pos, 16, 17))) {
|
||||
this.options.position = pos;
|
||||
} else {
|
||||
throw is.invalidParameterError('position', 'valid position/gravity/strategy', options.position);
|
||||
}
|
||||
}
|
||||
// Background
|
||||
this._setBackgroundColourOption('resizeBackground', options.background);
|
||||
// Kernel
|
||||
if (is.defined(options.kernel)) {
|
||||
if (is.string(kernel[options.kernel])) {
|
||||
this.options.kernel = kernel[options.kernel];
|
||||
} else {
|
||||
throw is.invalidParameterError('kernel', 'valid kernel name', options.kernel);
|
||||
}
|
||||
}
|
||||
// Without enlargement
|
||||
if (is.defined(options.withoutEnlargement)) {
|
||||
this._setBooleanOption('withoutEnlargement', options.withoutEnlargement);
|
||||
}
|
||||
// Without reduction
|
||||
if (is.defined(options.withoutReduction)) {
|
||||
this._setBooleanOption('withoutReduction', options.withoutReduction);
|
||||
}
|
||||
// Shrink on load
|
||||
if (is.defined(options.fastShrinkOnLoad)) {
|
||||
this._setBooleanOption('fastShrinkOnLoad', options.fastShrinkOnLoad);
|
||||
}
|
||||
}
|
||||
if (isRotationExpected(this.options) && isResizeExpected(this.options)) {
|
||||
this.options.rotateBeforePreExtract = true;
|
||||
}
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Extend / pad / extrude one or more edges of the image with either
|
||||
* the provided background colour or pixels derived from the image.
|
||||
* This operation will always occur after resizing and extraction, if any.
|
||||
*
|
||||
* @example
|
||||
* // Resize to 140 pixels wide, then add 10 transparent pixels
|
||||
* // to the top, left and right edges and 20 to the bottom edge
|
||||
* sharp(input)
|
||||
* .resize(140)
|
||||
* .extend({
|
||||
* top: 10,
|
||||
* bottom: 20,
|
||||
* left: 10,
|
||||
* right: 10,
|
||||
* background: { r: 0, g: 0, b: 0, alpha: 0 }
|
||||
* })
|
||||
* ...
|
||||
*
|
||||
* @example
|
||||
* // Add a row of 10 red pixels to the bottom
|
||||
* sharp(input)
|
||||
* .extend({
|
||||
* bottom: 10,
|
||||
* background: 'red'
|
||||
* })
|
||||
* ...
|
||||
*
|
||||
* @example
|
||||
* // Extrude image by 8 pixels to the right, mirroring existing right hand edge
|
||||
* sharp(input)
|
||||
* .extend({
|
||||
* right: 8,
|
||||
* background: 'mirror'
|
||||
* })
|
||||
* ...
|
||||
*
|
||||
* @param {(number|Object)} extend - single pixel count to add to all edges or an Object with per-edge counts
|
||||
* @param {number} [extend.top=0]
|
||||
* @param {number} [extend.left=0]
|
||||
* @param {number} [extend.bottom=0]
|
||||
* @param {number} [extend.right=0]
|
||||
* @param {String} [extend.extendWith='background'] - populate new pixels using this method, one of: background, copy, repeat, mirror.
|
||||
* @param {String|Object} [extend.background={r: 0, g: 0, b: 0, alpha: 1}] - background colour, parsed by the [color](https://www.npmjs.org/package/color) module, defaults to black without transparency.
|
||||
* @returns {Sharp}
|
||||
* @throws {Error} Invalid parameters
|
||||
*/
|
||||
function extend (extend) {
|
||||
if (is.integer(extend) && extend > 0) {
|
||||
this.options.extendTop = extend;
|
||||
this.options.extendBottom = extend;
|
||||
this.options.extendLeft = extend;
|
||||
this.options.extendRight = extend;
|
||||
} else if (is.object(extend)) {
|
||||
if (is.defined(extend.top)) {
|
||||
if (is.integer(extend.top) && extend.top >= 0) {
|
||||
this.options.extendTop = extend.top;
|
||||
} else {
|
||||
throw is.invalidParameterError('top', 'positive integer', extend.top);
|
||||
}
|
||||
}
|
||||
if (is.defined(extend.bottom)) {
|
||||
if (is.integer(extend.bottom) && extend.bottom >= 0) {
|
||||
this.options.extendBottom = extend.bottom;
|
||||
} else {
|
||||
throw is.invalidParameterError('bottom', 'positive integer', extend.bottom);
|
||||
}
|
||||
}
|
||||
if (is.defined(extend.left)) {
|
||||
if (is.integer(extend.left) && extend.left >= 0) {
|
||||
this.options.extendLeft = extend.left;
|
||||
} else {
|
||||
throw is.invalidParameterError('left', 'positive integer', extend.left);
|
||||
}
|
||||
}
|
||||
if (is.defined(extend.right)) {
|
||||
if (is.integer(extend.right) && extend.right >= 0) {
|
||||
this.options.extendRight = extend.right;
|
||||
} else {
|
||||
throw is.invalidParameterError('right', 'positive integer', extend.right);
|
||||
}
|
||||
}
|
||||
this._setBackgroundColourOption('extendBackground', extend.background);
|
||||
if (is.defined(extend.extendWith)) {
|
||||
if (is.string(extendWith[extend.extendWith])) {
|
||||
this.options.extendWith = extendWith[extend.extendWith];
|
||||
} else {
|
||||
throw is.invalidParameterError('extendWith', 'one of: background, copy, repeat, mirror', extend.extendWith);
|
||||
}
|
||||
}
|
||||
} else {
|
||||
throw is.invalidParameterError('extend', 'integer or object', extend);
|
||||
}
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Extract/crop a region of the image.
|
||||
*
|
||||
* - Use `extract` before `resize` for pre-resize extraction.
|
||||
* - Use `extract` after `resize` for post-resize extraction.
|
||||
* - Use `extract` twice and `resize` once for extract-then-resize-then-extract in a fixed operation order.
|
||||
*
|
||||
* @example
|
||||
* sharp(input)
|
||||
* .extract({ left: left, top: top, width: width, height: height })
|
||||
* .toFile(output, function(err) {
|
||||
* // Extract a region of the input image, saving in the same format.
|
||||
* });
|
||||
* @example
|
||||
* sharp(input)
|
||||
* .extract({ left: leftOffsetPre, top: topOffsetPre, width: widthPre, height: heightPre })
|
||||
* .resize(width, height)
|
||||
* .extract({ left: leftOffsetPost, top: topOffsetPost, width: widthPost, height: heightPost })
|
||||
* .toFile(output, function(err) {
|
||||
* // Extract a region, resize, then extract from the resized image
|
||||
* });
|
||||
*
|
||||
* @param {Object} options - describes the region to extract using integral pixel values
|
||||
* @param {number} options.left - zero-indexed offset from left edge
|
||||
* @param {number} options.top - zero-indexed offset from top edge
|
||||
* @param {number} options.width - width of region to extract
|
||||
* @param {number} options.height - height of region to extract
|
||||
* @returns {Sharp}
|
||||
* @throws {Error} Invalid parameters
|
||||
*/
|
||||
function extract (options) {
|
||||
const suffix = isResizeExpected(this.options) || this.options.widthPre !== -1 ? 'Post' : 'Pre';
|
||||
if (this.options[`width${suffix}`] !== -1) {
|
||||
this.options.debuglog('ignoring previous extract options');
|
||||
}
|
||||
['left', 'top', 'width', 'height'].forEach(function (name) {
|
||||
const value = options[name];
|
||||
if (is.integer(value) && value >= 0) {
|
||||
this.options[name + (name === 'left' || name === 'top' ? 'Offset' : '') + suffix] = value;
|
||||
} else {
|
||||
throw is.invalidParameterError(name, 'integer', value);
|
||||
}
|
||||
}, this);
|
||||
// Ensure existing rotation occurs before pre-resize extraction
|
||||
if (isRotationExpected(this.options) && !isResizeExpected(this.options)) {
|
||||
if (this.options.widthPre === -1 || this.options.widthPost === -1) {
|
||||
this.options.rotateBeforePreExtract = true;
|
||||
}
|
||||
}
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Trim pixels from all edges that contain values similar to the given background colour, which defaults to that of the top-left pixel.
|
||||
*
|
||||
* Images with an alpha channel will use the combined bounding box of alpha and non-alpha channels.
|
||||
*
|
||||
* If the result of this operation would trim an image to nothing then no change is made.
|
||||
*
|
||||
* The `info` response Object will contain `trimOffsetLeft` and `trimOffsetTop` properties.
|
||||
*
|
||||
* @example
|
||||
* // Trim pixels with a colour similar to that of the top-left pixel.
|
||||
* await sharp(input)
|
||||
* .trim()
|
||||
* .toFile(output);
|
||||
*
|
||||
* @example
|
||||
* // Trim pixels with the exact same colour as that of the top-left pixel.
|
||||
* await sharp(input)
|
||||
* .trim({
|
||||
* threshold: 0
|
||||
* })
|
||||
* .toFile(output);
|
||||
*
|
||||
* @example
|
||||
* // Assume input is line art and trim only pixels with a similar colour to red.
|
||||
* const output = await sharp(input)
|
||||
* .trim({
|
||||
* background: "#FF0000",
|
||||
* lineArt: true
|
||||
* })
|
||||
* .toBuffer();
|
||||
*
|
||||
* @example
|
||||
* // Trim all "yellow-ish" pixels, being more lenient with the higher threshold.
|
||||
* const output = await sharp(input)
|
||||
* .trim({
|
||||
* background: "yellow",
|
||||
* threshold: 42,
|
||||
* })
|
||||
* .toBuffer();
|
||||
*
|
||||
* @param {Object} [options]
|
||||
* @param {string|Object} [options.background='top-left pixel'] - Background colour, parsed by the [color](https://www.npmjs.org/package/color) module, defaults to that of the top-left pixel.
|
||||
* @param {number} [options.threshold=10] - Allowed difference from the above colour, a positive number.
|
||||
* @param {boolean} [options.lineArt=false] - Does the input more closely resemble line art (e.g. vector) rather than being photographic?
|
||||
* @returns {Sharp}
|
||||
* @throws {Error} Invalid parameters
|
||||
*/
|
||||
function trim (options) {
|
||||
this.options.trimThreshold = 10;
|
||||
if (is.defined(options)) {
|
||||
if (is.object(options)) {
|
||||
if (is.defined(options.background)) {
|
||||
this._setBackgroundColourOption('trimBackground', options.background);
|
||||
}
|
||||
if (is.defined(options.threshold)) {
|
||||
if (is.number(options.threshold) && options.threshold >= 0) {
|
||||
this.options.trimThreshold = options.threshold;
|
||||
} else {
|
||||
throw is.invalidParameterError('threshold', 'positive number', options.threshold);
|
||||
}
|
||||
}
|
||||
if (is.defined(options.lineArt)) {
|
||||
this._setBooleanOption('trimLineArt', options.lineArt);
|
||||
}
|
||||
} else {
|
||||
throw is.invalidParameterError('trim', 'object', options);
|
||||
}
|
||||
}
|
||||
if (isRotationExpected(this.options)) {
|
||||
this.options.rotateBeforePreExtract = true;
|
||||
}
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Decorate the Sharp prototype with resize-related functions.
|
||||
* @private
|
||||
*/
|
||||
module.exports = function (Sharp) {
|
||||
Object.assign(Sharp.prototype, {
|
||||
resize,
|
||||
extend,
|
||||
extract,
|
||||
trim
|
||||
});
|
||||
// Class attributes
|
||||
Sharp.gravity = gravity;
|
||||
Sharp.strategy = strategy;
|
||||
Sharp.kernel = kernel;
|
||||
Sharp.fit = fit;
|
||||
Sharp.position = position;
|
||||
};
|
114
node_modules/sharp/lib/sharp.js
generated
vendored
Normal file
@@ -0,0 +1,114 @@
|
||||
// Copyright 2013 Lovell Fuller and others.
|
||||
// SPDX-License-Identifier: Apache-2.0
|
||||
|
||||
'use strict';
|
||||
|
||||
// Inspects the runtime environment and exports the relevant sharp.node binary
|
||||
|
||||
const { familySync, versionSync } = require('detect-libc');
|
||||
|
||||
const { runtimePlatformArch, isUnsupportedNodeRuntime, prebuiltPlatforms, minimumLibvipsVersion } = require('./libvips');
|
||||
const runtimePlatform = runtimePlatformArch();
|
||||
|
||||
const paths = [
|
||||
`../src/build/Release/sharp-${runtimePlatform}.node`,
|
||||
'../src/build/Release/sharp-wasm32.node',
|
||||
`@img/sharp-${runtimePlatform}/sharp.node`,
|
||||
'@img/sharp-wasm32/sharp.node'
|
||||
];
|
||||
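// Resolution order: locally-built native binary, locally-built WebAssembly
// fallback, prebuilt platform-specific package, then the prebuilt WebAssembly package.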
|
||||
let sharp;
|
||||
const errors = [];
|
||||
for (const path of paths) {
|
||||
try {
|
||||
sharp = require(path);
|
||||
break;
|
||||
} catch (err) {
|
||||
/* istanbul ignore next */
|
||||
errors.push(err);
|
||||
}
|
||||
}
|
||||
|
||||
/* istanbul ignore next */
|
||||
if (sharp) {
|
||||
module.exports = sharp;
|
||||
} else {
|
||||
const [isLinux, isMacOs, isWindows] = ['linux', 'darwin', 'win32'].map(os => runtimePlatform.startsWith(os));
|
||||
|
||||
const help = [`Could not load the "sharp" module using the ${runtimePlatform} runtime`];
|
||||
errors.forEach(err => {
|
||||
if (err.code !== 'MODULE_NOT_FOUND') {
|
||||
help.push(`${err.code}: ${err.message}`);
|
||||
}
|
||||
});
|
||||
const messages = errors.map(err => err.message).join(' ');
|
||||
help.push('Possible solutions:');
|
||||
// Common error messages
|
||||
if (isUnsupportedNodeRuntime()) {
|
||||
const { found, expected } = isUnsupportedNodeRuntime();
|
||||
help.push(
|
||||
'- Please upgrade Node.js:',
|
||||
` Found ${found}`,
|
||||
` Requires ${expected}`
|
||||
);
|
||||
} else if (prebuiltPlatforms.includes(runtimePlatform)) {
|
||||
const [os, cpu] = runtimePlatform.split('-');
|
||||
const libc = os.endsWith('musl') ? ' --libc=musl' : '';
|
||||
help.push(
|
||||
'- Ensure optional dependencies can be installed:',
|
||||
' npm install --include=optional sharp',
|
||||
'- Ensure your package manager supports multi-platform installation:',
|
||||
' See https://sharp.pixelplumbing.com/install#cross-platform',
|
||||
'- Add platform-specific dependencies:',
|
||||
` npm install --os=${os.replace('musl', '')}${libc} --cpu=${cpu} sharp`
|
||||
);
|
||||
} else {
|
||||
help.push(
|
||||
`- Manually install libvips >= ${minimumLibvipsVersion}`,
|
||||
'- Add experimental WebAssembly-based dependencies:',
|
||||
' npm install --cpu=wasm32 sharp',
|
||||
' npm install @img/sharp-wasm32'
|
||||
);
|
||||
}
|
||||
if (isLinux && /(symbol not found|CXXABI_)/i.test(messages)) {
|
||||
try {
|
||||
const { config } = require(`@img/sharp-libvips-${runtimePlatform}/package`);
|
||||
const libcFound = `${familySync()} ${versionSync()}`;
|
||||
const libcRequires = `${config.musl ? 'musl' : 'glibc'} ${config.musl || config.glibc}`;
|
||||
help.push(
|
||||
'- Update your OS:',
|
||||
` Found ${libcFound}`,
|
||||
` Requires ${libcRequires}`
|
||||
);
|
||||
} catch (errEngines) {}
|
||||
}
|
||||
if (isLinux && /\/snap\/core[0-9]{2}/.test(messages)) {
|
||||
help.push(
|
||||
'- Remove the Node.js Snap, which does not support native modules',
|
||||
' snap remove node'
|
||||
);
|
||||
}
|
||||
if (isMacOs && /Incompatible library version/.test(messages)) {
|
||||
help.push(
|
||||
'- Update Homebrew:',
|
||||
' brew update && brew upgrade vips'
|
||||
);
|
||||
}
|
||||
if (errors.some(err => err.code === 'ERR_DLOPEN_DISABLED')) {
|
||||
help.push('- Run Node.js without using the --no-addons flag');
|
||||
}
|
||||
// Link to installation docs
|
||||
if (isWindows && /The specified procedure could not be found/.test(messages)) {
|
||||
help.push(
|
||||
'- Using the canvas package on Windows?',
|
||||
' See https://sharp.pixelplumbing.com/install#canvas-and-windows',
|
||||
'- Check for outdated versions of sharp in the dependency tree:',
|
||||
' npm ls sharp'
|
||||
);
|
||||
}
|
||||
help.push(
|
||||
'- Consult the installation documentation:',
|
||||
' See https://sharp.pixelplumbing.com/install'
|
||||
);
|
||||
throw new Error(help.join('\n'));
|
||||
}
|
296
node_modules/sharp/lib/utility.js
generated
vendored
Normal file
@@ -0,0 +1,296 @@
|
||||
// Copyright 2013 Lovell Fuller and others.
|
||||
// SPDX-License-Identifier: Apache-2.0
|
||||
|
||||
'use strict';
|
||||
|
||||
const events = require('node:events');
|
||||
const detectLibc = require('detect-libc');
|
||||
|
||||
const is = require('./is');
|
||||
const { runtimePlatformArch } = require('./libvips');
|
||||
const sharp = require('./sharp');
|
||||
|
||||
const runtimePlatform = runtimePlatformArch();
|
||||
const libvipsVersion = sharp.libvipsVersion();
|
||||
|
||||
/**
|
||||
* An Object containing nested boolean values representing the available input and output formats/methods.
|
||||
* @member
|
||||
* @example
|
||||
* console.log(sharp.format);
|
||||
* @returns {Object}
|
||||
*/
|
||||
const format = sharp.format();
|
||||
format.heif.output.alias = ['avif', 'heic'];
|
||||
format.jpeg.output.alias = ['jpe', 'jpg'];
|
||||
format.tiff.output.alias = ['tif'];
|
||||
format.jp2k.output.alias = ['j2c', 'j2k', 'jp2', 'jpx'];
|
||||
|
||||
/**
|
||||
* An Object containing the available interpolators and their proper values
|
||||
* @readonly
|
||||
* @enum {string}
|
||||
*/
|
||||
const interpolators = {
|
||||
/** [Nearest neighbour interpolation](http://en.wikipedia.org/wiki/Nearest-neighbor_interpolation). Suitable for image enlargement only. */
|
||||
nearest: 'nearest',
|
||||
/** [Bilinear interpolation](http://en.wikipedia.org/wiki/Bilinear_interpolation). Faster than bicubic but with less smooth results. */
|
||||
bilinear: 'bilinear',
|
||||
/** [Bicubic interpolation](http://en.wikipedia.org/wiki/Bicubic_interpolation) (the default). */
|
||||
bicubic: 'bicubic',
|
||||
/** [LBB interpolation](https://github.com/libvips/libvips/blob/master/libvips/resample/lbb.cpp#L100). Prevents some "[acutance](http://en.wikipedia.org/wiki/Acutance)" but typically reduces performance by a factor of 2. */
|
||||
locallyBoundedBicubic: 'lbb',
|
||||
/** [Nohalo interpolation](http://eprints.soton.ac.uk/268086/). Prevents acutance but typically reduces performance by a factor of 3. */
|
||||
nohalo: 'nohalo',
|
||||
/** [VSQBS interpolation](https://github.com/libvips/libvips/blob/master/libvips/resample/vsqbs.cpp#L48). Prevents "staircasing" when enlarging. */
|
||||
vertexSplitQuadraticBasisSpline: 'vsqbs'
|
||||
};
|
||||
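// Illustrative usage sketch, not part of the original source: these values are
// typically passed via the `interpolator` option of affine-based operations, e.g.
//   await sharp(input)
//     .affine([[1, 0.3], [0.1, 0.7]], { interpolator: sharp.interpolators.nohalo })
//     .toBuffer();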
|
||||
/**
|
||||
* An Object containing the version numbers of sharp, libvips
|
||||
* and (when using prebuilt binaries) its dependencies.
|
||||
*
|
||||
* @member
|
||||
* @example
|
||||
* console.log(sharp.versions);
|
||||
*/
|
||||
let versions = {
|
||||
vips: libvipsVersion.semver
|
||||
};
|
||||
/* istanbul ignore next */
|
||||
if (!libvipsVersion.isGlobal) {
|
||||
if (!libvipsVersion.isWasm) {
|
||||
try {
|
||||
versions = require(`@img/sharp-${runtimePlatform}/versions`);
|
||||
} catch (_) {
|
||||
try {
|
||||
versions = require(`@img/sharp-libvips-${runtimePlatform}/versions`);
|
||||
} catch (_) {}
|
||||
}
|
||||
} else {
|
||||
try {
|
||||
versions = require('@img/sharp-wasm32/versions');
|
||||
} catch (_) {}
|
||||
}
|
||||
}
|
||||
versions.sharp = require('../package.json').version;
|
||||
|
||||
/* istanbul ignore next */
|
||||
if (versions.heif && format.heif) {
|
||||
// Prebuilt binaries provide AV1
|
||||
format.heif.input.fileSuffix = ['.avif'];
|
||||
format.heif.output.alias = ['avif'];
|
||||
}
|
||||
|
||||
/**
|
||||
* Gets or, when options are provided, sets the limits of _libvips'_ operation cache.
|
||||
* Existing entries in the cache will be trimmed after any change in limits.
|
||||
* This method always returns cache statistics,
|
||||
* useful for determining how much working memory is required for a particular task.
|
||||
*
|
||||
* @example
|
||||
* const stats = sharp.cache();
|
||||
* @example
|
||||
* sharp.cache( { items: 200 } );
|
||||
* sharp.cache( { files: 0 } );
|
||||
* sharp.cache(false);
|
||||
*
|
||||
* @param {Object|boolean} [options=true] - Object with the following attributes, or boolean where true uses default cache settings and false removes all caching
|
||||
* @param {number} [options.memory=50] - is the maximum memory in MB to use for this cache
|
||||
* @param {number} [options.files=20] - is the maximum number of files to hold open
|
||||
* @param {number} [options.items=100] - is the maximum number of operations to cache
|
||||
* @returns {Object}
|
||||
*/
|
||||
function cache (options) {
|
||||
if (is.bool(options)) {
|
||||
if (options) {
|
||||
// Default cache settings of 50MB, 20 files, 100 items
|
||||
return sharp.cache(50, 20, 100);
|
||||
} else {
|
||||
return sharp.cache(0, 0, 0);
|
||||
}
|
||||
} else if (is.object(options)) {
|
||||
return sharp.cache(options.memory, options.files, options.items);
|
||||
} else {
|
||||
return sharp.cache();
|
||||
}
|
||||
}
|
||||
cache(true);
|
||||
|
||||
/**
|
||||
* Gets or, when a concurrency is provided, sets
|
||||
* the maximum number of threads _libvips_ should use to process _each image_.
|
||||
* These are from a thread pool managed by glib,
|
||||
* which helps avoid the overhead of creating new threads.
|
||||
*
|
||||
* This method always returns the current concurrency.
|
||||
*
|
||||
* The default value is the number of CPU cores,
|
||||
* except when using glibc-based Linux without jemalloc,
|
||||
* where the default is `1` to help reduce memory fragmentation.
|
||||
*
|
||||
* A value of `0` will reset this to the number of CPU cores.
|
||||
*
|
||||
* Some image format libraries spawn additional threads,
|
||||
* e.g. libaom manages its own 4 threads when encoding AVIF images,
|
||||
* and these are independent of the value set here.
|
||||
*
|
||||
* The maximum number of images that sharp can process in parallel
|
||||
* is controlled by libuv's `UV_THREADPOOL_SIZE` environment variable,
|
||||
* which defaults to 4.
|
||||
*
|
||||
* https://nodejs.org/api/cli.html#uv_threadpool_sizesize
|
||||
*
|
||||
* For example, by default, a machine with 8 CPU cores will process
|
||||
* 4 images in parallel and use up to 8 threads per image,
|
||||
* so there will be up to 32 concurrent threads.
|
||||
*
|
||||
* @example
|
||||
* const threads = sharp.concurrency(); // 4
|
||||
* sharp.concurrency(2); // 2
|
||||
* sharp.concurrency(0); // 4
|
||||
*
|
||||
* @param {number} [concurrency]
|
||||
* @returns {number} concurrency
|
||||
*/
|
||||
function concurrency (concurrency) {
|
||||
return sharp.concurrency(is.integer(concurrency) ? concurrency : null);
|
||||
}
|
||||
/* istanbul ignore next */
|
||||
if (detectLibc.familySync() === detectLibc.GLIBC && !sharp._isUsingJemalloc()) {
|
||||
// Reduce default concurrency to 1 when using glibc memory allocator
|
||||
sharp.concurrency(1);
|
||||
} else if (detectLibc.familySync() === detectLibc.MUSL && sharp.concurrency() === 1024) {
|
||||
// Reduce default concurrency when musl thread over-subscription detected
|
||||
sharp.concurrency(require('node:os').availableParallelism());
|
||||
}
|
||||
|
||||
/**
|
||||
* An EventEmitter that emits a `change` event when a task is either:
|
||||
* - queued, waiting for _libuv_ to provide a worker thread
|
||||
* - complete
|
||||
* @member
|
||||
* @example
|
||||
* sharp.queue.on('change', function(queueLength) {
|
||||
* console.log('Queue contains ' + queueLength + ' task(s)');
|
||||
* });
|
||||
*/
|
||||
const queue = new events.EventEmitter();
|
||||
|
||||
/**
|
||||
* Provides access to internal task counters.
|
||||
* - queue is the number of tasks this module has queued waiting for _libuv_ to provide a worker thread from its pool.
|
||||
* - process is the number of resize tasks currently being processed.
|
||||
*
|
||||
* @example
|
||||
* const counters = sharp.counters(); // { queue: 2, process: 4 }
|
||||
*
|
||||
* @returns {Object}
|
||||
*/
|
||||
function counters () {
|
||||
return sharp.counters();
|
||||
}
|
||||
|
||||
/**
|
||||
* Get and set use of SIMD vector unit instructions.
|
||||
* Requires libvips to have been compiled with highway support.
|
||||
*
|
||||
* Improves the performance of `resize`, `blur` and `sharpen` operations
|
||||
* by taking advantage of the SIMD vector unit of the CPU, e.g. Intel SSE and ARM NEON.
|
||||
*
|
||||
* @example
|
||||
* const simd = sharp.simd();
|
||||
* // simd is `true` if the runtime use of highway is currently enabled
|
||||
* @example
|
||||
* const simd = sharp.simd(false);
|
||||
* // prevent libvips from using highway at runtime
|
||||
*
|
||||
* @param {boolean} [simd=true]
|
||||
* @returns {boolean}
|
||||
*/
|
||||
function simd (simd) {
|
||||
return sharp.simd(is.bool(simd) ? simd : null);
|
||||
}
|
||||
|
||||
/**
|
||||
* Block libvips operations at runtime.
|
||||
*
|
||||
* This is in addition to the `VIPS_BLOCK_UNTRUSTED` environment variable,
|
||||
* which when set will block all "untrusted" operations.
|
||||
*
|
||||
* @since 0.32.4
|
||||
*
|
||||
* @example <caption>Block all TIFF input.</caption>
|
||||
* sharp.block({
|
||||
* operation: ['VipsForeignLoadTiff']
|
||||
* });
|
||||
*
|
||||
* @param {Object} options
|
||||
* @param {Array<string>} options.operation - List of libvips low-level operation names to block.
|
||||
*/
|
||||
function block (options) {
|
||||
if (is.object(options)) {
|
||||
if (Array.isArray(options.operation) && options.operation.every(is.string)) {
|
||||
sharp.block(options.operation, true);
|
||||
} else {
|
||||
throw is.invalidParameterError('operation', 'Array<string>', options.operation);
|
||||
}
|
||||
} else {
|
||||
throw is.invalidParameterError('options', 'object', options);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Unblock libvips operations at runtime.
|
||||
*
|
||||
* This is useful for defining a list of allowed operations.
|
||||
*
|
||||
* @since 0.32.4
|
||||
*
|
||||
* @example <caption>Block all input except WebP from the filesystem.</caption>
|
||||
* sharp.block({
|
||||
* operation: ['VipsForeignLoad']
|
||||
* });
|
||||
* sharp.unblock({
|
||||
* operation: ['VipsForeignLoadWebpFile']
|
||||
* });
|
||||
*
|
||||
* @example <caption>Block all input except JPEG and PNG from a Buffer or Stream.</caption>
|
||||
* sharp.block({
|
||||
* operation: ['VipsForeignLoad']
|
||||
* });
|
||||
* sharp.unblock({
|
||||
* operation: ['VipsForeignLoadJpegBuffer', 'VipsForeignLoadPngBuffer']
|
||||
* });
|
||||
*
|
||||
* @param {Object} options
|
||||
* @param {Array<string>} options.operation - List of libvips low-level operation names to unblock.
|
||||
*/
|
||||
function unblock (options) {
|
||||
if (is.object(options)) {
|
||||
if (Array.isArray(options.operation) && options.operation.every(is.string)) {
|
||||
sharp.block(options.operation, false);
|
||||
} else {
|
||||
throw is.invalidParameterError('operation', 'Array<string>', options.operation);
|
||||
}
|
||||
} else {
|
||||
throw is.invalidParameterError('options', 'object', options);
|
||||
}
|
||||
}
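
A combined sketch of the block()/unblock() pair above, reusing only the operation names already shown in the comments (an allow-list permitting JPEG and PNG input from Buffers):

const sharp = require('sharp');

// Deny every loader, then re-allow JPEG and PNG input from Buffers only
sharp.block({ operation: ['VipsForeignLoad'] });
sharp.unblock({ operation: ['VipsForeignLoadJpegBuffer', 'VipsForeignLoadPngBuffer'] });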
|
||||
|
||||
/**
|
||||
* Decorate the Sharp class with utility-related functions.
|
||||
* @private
|
||||
*/
|
||||
module.exports = function (Sharp) {
|
||||
Sharp.cache = cache;
|
||||
Sharp.concurrency = concurrency;
|
||||
Sharp.counters = counters;
|
||||
Sharp.simd = simd;
|
||||
Sharp.format = format;
|
||||
Sharp.interpolators = interpolators;
|
||||
Sharp.versions = versions;
|
||||
Sharp.queue = queue;
|
||||
Sharp.block = block;
|
||||
Sharp.unblock = unblock;
|
||||
};
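
Once decorated as above, these helpers are available as statics on the exported constructor; a brief sketch (illustrative values) of how downstream code might use several of them together:

const sharp = require('sharp');

sharp.cache({ files: 0 });        // keep no file descriptors open in the cache
sharp.concurrency(1);             // one libvips thread per image
sharp.simd(true);                 // leave highway SIMD enabled

sharp.queue.on('change', (queueLength) => {
  console.log(`queued tasks: ${queueLength}`);
});

console.log(sharp.counters());    // e.g. { queue: 0, process: 0 }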
|
222
node_modules/sharp/package.json
generated
vendored
Normal file
@@ -0,0 +1,222 @@
|
||||
{
|
||||
"name": "sharp",
|
||||
"description": "High performance Node.js image processing, the fastest module to resize JPEG, PNG, WebP, GIF, AVIF and TIFF images",
|
||||
"version": "0.33.5",
|
||||
"author": "Lovell Fuller <npm@lovell.info>",
|
||||
"homepage": "https://sharp.pixelplumbing.com",
|
||||
"contributors": [
|
||||
"Pierre Inglebert <pierre.inglebert@gmail.com>",
|
||||
"Jonathan Ong <jonathanrichardong@gmail.com>",
|
||||
"Chanon Sajjamanochai <chanon.s@gmail.com>",
|
||||
"Juliano Julio <julianojulio@gmail.com>",
|
||||
"Daniel Gasienica <daniel@gasienica.ch>",
|
||||
"Julian Walker <julian@fiftythree.com>",
|
||||
"Amit Pitaru <pitaru.amit@gmail.com>",
|
||||
"Brandon Aaron <hello.brandon@aaron.sh>",
|
||||
"Andreas Lind <andreas@one.com>",
|
||||
"Maurus Cuelenaere <mcuelenaere@gmail.com>",
|
||||
"Linus Unnebäck <linus@folkdatorn.se>",
|
||||
"Victor Mateevitsi <mvictoras@gmail.com>",
|
||||
"Alaric Holloway <alaric.holloway@gmail.com>",
|
||||
"Bernhard K. Weisshuhn <bkw@codingforce.com>",
|
||||
"Chris Riley <criley@primedia.com>",
|
||||
"David Carley <dacarley@gmail.com>",
|
||||
"John Tobin <john@limelightmobileinc.com>",
|
||||
"Kenton Gray <kentongray@gmail.com>",
|
||||
"Felix Bünemann <Felix.Buenemann@gmail.com>",
|
||||
"Samy Al Zahrani <samyalzahrany@gmail.com>",
|
||||
"Chintan Thakkar <lemnisk8@gmail.com>",
|
||||
"F. Orlando Galashan <frulo@gmx.de>",
|
||||
"Kleis Auke Wolthuizen <info@kleisauke.nl>",
|
||||
"Matt Hirsch <mhirsch@media.mit.edu>",
|
||||
"Matthias Thoemmes <thoemmes@gmail.com>",
|
||||
"Patrick Paskaris <patrick@paskaris.gr>",
|
||||
"Jérémy Lal <kapouer@melix.org>",
|
||||
"Rahul Nanwani <r.nanwani@gmail.com>",
|
||||
"Alice Monday <alice0meta@gmail.com>",
|
||||
"Kristo Jorgenson <kristo.jorgenson@gmail.com>",
|
||||
"YvesBos <yves_bos@outlook.com>",
|
||||
"Guy Maliar <guy@tailorbrands.com>",
|
||||
"Nicolas Coden <nicolas@ncoden.fr>",
|
||||
"Matt Parrish <matt.r.parrish@gmail.com>",
|
||||
"Marcel Bretschneider <marcel.bretschneider@gmail.com>",
|
||||
"Matthew McEachen <matthew+github@mceachen.org>",
|
||||
"Jarda Kotěšovec <jarda.kotesovec@gmail.com>",
|
||||
"Kenric D'Souza <kenric.dsouza@gmail.com>",
|
||||
"Oleh Aleinyk <oleg.aleynik@gmail.com>",
|
||||
"Marcel Bretschneider <marcel.bretschneider@gmail.com>",
|
||||
"Andrea Bianco <andrea.bianco@unibas.ch>",
|
||||
"Rik Heywood <rik@rik.org>",
|
||||
"Thomas Parisot <hi@oncletom.io>",
|
||||
"Nathan Graves <nathanrgraves+github@gmail.com>",
|
||||
"Tom Lokhorst <tom@lokhorst.eu>",
|
||||
"Espen Hovlandsdal <espen@hovlandsdal.com>",
|
||||
"Sylvain Dumont <sylvain.dumont35@gmail.com>",
|
||||
"Alun Davies <alun.owain.davies@googlemail.com>",
|
||||
"Aidan Hoolachan <ajhoolachan21@gmail.com>",
|
||||
"Axel Eirola <axel.eirola@iki.fi>",
|
||||
"Freezy <freezy@xbmc.org>",
|
||||
"Daiz <taneli.vatanen@gmail.com>",
|
||||
"Julian Aubourg <j@ubourg.net>",
|
||||
"Keith Belovay <keith@picthrive.com>",
|
||||
"Michael B. Klein <mbklein@gmail.com>",
|
||||
"Jordan Prudhomme <jordan@raboland.fr>",
|
||||
"Ilya Ovdin <iovdin@gmail.com>",
|
||||
"Andargor <andargor@yahoo.com>",
|
||||
"Paul Neave <paul.neave@gmail.com>",
|
||||
"Brendan Kennedy <brenwken@gmail.com>",
|
||||
"Brychan Bennett-Odlum <git@brychan.io>",
|
||||
"Edward Silverton <e.silverton@gmail.com>",
|
||||
"Roman Malieiev <aromaleev@gmail.com>",
|
||||
"Tomas Szabo <tomas.szabo@deftomat.com>",
|
||||
"Robert O'Rourke <robert@o-rourke.org>",
|
||||
"Guillermo Alfonso Varela Chouciño <guillevch@gmail.com>",
|
||||
"Christian Flintrup <chr@gigahost.dk>",
|
||||
"Manan Jadhav <manan@motionden.com>",
|
||||
"Leon Radley <leon@radley.se>",
|
||||
"alza54 <alza54@thiocod.in>",
|
||||
"Jacob Smith <jacob@frende.me>",
|
||||
"Michael Nutt <michael@nutt.im>",
|
||||
"Brad Parham <baparham@gmail.com>",
|
||||
"Taneli Vatanen <taneli.vatanen@gmail.com>",
|
||||
"Joris Dugué <zaruike10@gmail.com>",
|
||||
"Chris Banks <christopher.bradley.banks@gmail.com>",
|
||||
"Ompal Singh <ompal.hitm09@gmail.com>",
|
||||
"Brodan <christopher.hranj@gmail.com>",
|
||||
"Ankur Parihar <ankur.github@gmail.com>",
|
||||
"Brahim Ait elhaj <brahima@gmail.com>",
|
||||
"Mart Jansink <m.jansink@gmail.com>",
|
||||
"Lachlan Newman <lachnewman007@gmail.com>",
|
||||
"Dennis Beatty <dennis@dcbeatty.com>",
|
||||
"Ingvar Stepanyan <me@rreverser.com>",
|
||||
"Don Denton <don@happycollision.com>"
|
||||
],
|
||||
"scripts": {
|
||||
"install": "node install/check",
|
||||
"clean": "rm -rf src/build/ .nyc_output/ coverage/ test/fixtures/output.*",
|
||||
"test": "npm run test-lint && npm run test-unit && npm run test-licensing && npm run test-types",
|
||||
"test-lint": "semistandard && cpplint",
|
||||
"test-unit": "nyc --reporter=lcov --reporter=text --check-coverage --branches=100 mocha",
|
||||
"test-licensing": "license-checker --production --summary --onlyAllow=\"Apache-2.0;BSD;ISC;LGPL-3.0-or-later;MIT\"",
|
||||
"test-leak": "./test/leak/leak.sh",
|
||||
"test-types": "tsd",
|
||||
"package-from-local-build": "node npm/from-local-build",
|
||||
"package-from-github-release": "node npm/from-github-release",
|
||||
"docs-build": "node docs/build && node docs/search-index/build",
|
||||
"docs-serve": "cd docs && npx serve",
|
||||
"docs-publish": "cd docs && npx firebase-tools deploy --project pixelplumbing --only hosting:pixelplumbing-sharp"
|
||||
},
|
||||
"type": "commonjs",
|
||||
"main": "lib/index.js",
|
||||
"types": "lib/index.d.ts",
|
||||
"files": [
|
||||
"install",
|
||||
"lib",
|
||||
"src/*.{cc,h,gyp}"
|
||||
],
|
||||
"repository": {
|
||||
"type": "git",
|
||||
"url": "git://github.com/lovell/sharp.git"
|
||||
},
|
||||
"keywords": [
|
||||
"jpeg",
|
||||
"png",
|
||||
"webp",
|
||||
"avif",
|
||||
"tiff",
|
||||
"gif",
|
||||
"svg",
|
||||
"jp2",
|
||||
"dzi",
|
||||
"image",
|
||||
"resize",
|
||||
"thumbnail",
|
||||
"crop",
|
||||
"embed",
|
||||
"libvips",
|
||||
"vips"
|
||||
],
|
||||
"dependencies": {
|
||||
"color": "^4.2.3",
|
||||
"detect-libc": "^2.0.3",
|
||||
"semver": "^7.6.3"
|
||||
},
|
||||
"optionalDependencies": {
|
||||
"@img/sharp-darwin-arm64": "0.33.5",
|
||||
"@img/sharp-darwin-x64": "0.33.5",
|
||||
"@img/sharp-libvips-darwin-arm64": "1.0.4",
|
||||
"@img/sharp-libvips-darwin-x64": "1.0.4",
|
||||
"@img/sharp-libvips-linux-arm": "1.0.5",
|
||||
"@img/sharp-libvips-linux-arm64": "1.0.4",
|
||||
"@img/sharp-libvips-linux-s390x": "1.0.4",
|
||||
"@img/sharp-libvips-linux-x64": "1.0.4",
|
||||
"@img/sharp-libvips-linuxmusl-arm64": "1.0.4",
|
||||
"@img/sharp-libvips-linuxmusl-x64": "1.0.4",
|
||||
"@img/sharp-linux-arm": "0.33.5",
|
||||
"@img/sharp-linux-arm64": "0.33.5",
|
||||
"@img/sharp-linux-s390x": "0.33.5",
|
||||
"@img/sharp-linux-x64": "0.33.5",
|
||||
"@img/sharp-linuxmusl-arm64": "0.33.5",
|
||||
"@img/sharp-linuxmusl-x64": "0.33.5",
|
||||
"@img/sharp-wasm32": "0.33.5",
|
||||
"@img/sharp-win32-ia32": "0.33.5",
|
||||
"@img/sharp-win32-x64": "0.33.5"
|
||||
},
|
||||
"devDependencies": {
|
||||
"@emnapi/runtime": "^1.2.0",
|
||||
"@img/sharp-libvips-dev": "1.0.4",
|
||||
"@img/sharp-libvips-dev-wasm32": "1.0.5",
|
||||
"@img/sharp-libvips-win32-ia32": "1.0.4",
|
||||
"@img/sharp-libvips-win32-x64": "1.0.4",
|
||||
"@types/node": "*",
|
||||
"async": "^3.2.5",
|
||||
"cc": "^3.0.1",
|
||||
"emnapi": "^1.2.0",
|
||||
"exif-reader": "^2.0.1",
|
||||
"extract-zip": "^2.0.1",
|
||||
"icc": "^3.0.0",
|
||||
"jsdoc-to-markdown": "^8.0.3",
|
||||
"license-checker": "^25.0.1",
|
||||
"mocha": "^10.7.3",
|
||||
"node-addon-api": "^8.1.0",
|
||||
"nyc": "^17.0.0",
|
||||
"prebuild": "^13.0.1",
|
||||
"semistandard": "^17.0.0",
|
||||
"tar-fs": "^3.0.6",
|
||||
"tsd": "^0.31.1"
|
||||
},
|
||||
"license": "Apache-2.0",
|
||||
"engines": {
|
||||
"node": "^18.17.0 || ^20.3.0 || >=21.0.0"
|
||||
},
|
||||
"config": {
|
||||
"libvips": ">=8.15.3"
|
||||
},
|
||||
"funding": {
|
||||
"url": "https://opencollective.com/libvips"
|
||||
},
|
||||
"binary": {
|
||||
"napi_versions": [
|
||||
9
|
||||
]
|
||||
},
|
||||
"semistandard": {
|
||||
"env": [
|
||||
"mocha"
|
||||
]
|
||||
},
|
||||
"cc": {
|
||||
"linelength": "120",
|
||||
"filter": [
|
||||
"build/include"
|
||||
]
|
||||
},
|
||||
"nyc": {
|
||||
"include": [
|
||||
"lib"
|
||||
]
|
||||
},
|
||||
"tsd": {
|
||||
"directory": "test/types/"
|
||||
}
|
||||
}
|
280
node_modules/sharp/src/binding.gyp
generated
vendored
Normal file
@@ -0,0 +1,280 @@
|
||||
# Copyright 2013 Lovell Fuller and others.
|
||||
# SPDX-License-Identifier: Apache-2.0
|
||||
|
||||
{
|
||||
'variables': {
|
||||
'vips_version': '<!(node -p "require(\'../lib/libvips\').minimumLibvipsVersion")',
|
||||
'platform_and_arch': '<!(node -p "require(\'../lib/libvips\').buildPlatformArch()")',
|
||||
'sharp_libvips_version': '<!(node -p "require(\'../package.json\').optionalDependencies[\'@img/sharp-libvips-<(platform_and_arch)\']")',
|
||||
'sharp_libvips_yarn_locator': '<!(node -p "require(\'../lib/libvips\').yarnLocator()")',
|
||||
'sharp_libvips_include_dir': '<!(node -p "require(\'../lib/libvips\').buildSharpLibvipsIncludeDir()")',
|
||||
'sharp_libvips_cplusplus_dir': '<!(node -p "require(\'../lib/libvips\').buildSharpLibvipsCPlusPlusDir()")',
|
||||
'sharp_libvips_lib_dir': '<!(node -p "require(\'../lib/libvips\').buildSharpLibvipsLibDir()")'
|
||||
},
|
||||
'targets': [{
|
||||
'target_name': 'libvips-cpp',
|
||||
'conditions': [
|
||||
['OS == "win"', {
|
||||
# Build libvips C++ binding for Windows due to MSVC std library ABI changes
|
||||
'type': 'shared_library',
|
||||
'defines': [
|
||||
'VIPS_CPLUSPLUS_EXPORTS',
|
||||
'_ALLOW_KEYWORD_MACROS'
|
||||
],
|
||||
'sources': [
|
||||
'<(sharp_libvips_cplusplus_dir)/VConnection.cpp',
|
||||
'<(sharp_libvips_cplusplus_dir)/VError.cpp',
|
||||
'<(sharp_libvips_cplusplus_dir)/VImage.cpp',
|
||||
'<(sharp_libvips_cplusplus_dir)/VInterpolate.cpp',
|
||||
'<(sharp_libvips_cplusplus_dir)/VRegion.cpp'
|
||||
],
|
||||
'include_dirs': [
|
||||
'<(sharp_libvips_include_dir)',
|
||||
'<(sharp_libvips_include_dir)/glib-2.0',
|
||||
'<(sharp_libvips_lib_dir)/glib-2.0/include'
|
||||
],
|
||||
'link_settings': {
|
||||
'library_dirs': [
|
||||
'<(sharp_libvips_lib_dir)'
|
||||
],
|
||||
'libraries': [
|
||||
'libvips.lib'
|
||||
],
|
||||
},
|
||||
'configurations': {
|
||||
'Release': {
|
||||
'msvs_settings': {
|
||||
'VCCLCompilerTool': {
|
||||
'ExceptionHandling': 1,
|
||||
'Optimization': 1,
|
||||
'WholeProgramOptimization': 'true'
|
||||
},
|
||||
'VCLibrarianTool': {
|
||||
'AdditionalOptions': [
|
||||
'/LTCG:INCREMENTAL'
|
||||
]
|
||||
},
|
||||
'VCLinkerTool': {
|
||||
'ImageHasSafeExceptionHandlers': 'false',
|
||||
'OptimizeReferences': 2,
|
||||
'EnableCOMDATFolding': 2,
|
||||
'LinkIncremental': 1,
|
||||
'AdditionalOptions': [
|
||||
'/LTCG:INCREMENTAL'
|
||||
]
|
||||
}
|
||||
},
|
||||
'msvs_disabled_warnings': [
|
||||
4275
|
||||
]
|
||||
}
|
||||
}
|
||||
}, {
|
||||
# Ignore this target for non-Windows
|
||||
'type': 'none'
|
||||
}]
|
||||
]
|
||||
}, {
|
||||
'target_name': 'sharp-<(platform_and_arch)',
|
||||
'defines': [
|
||||
'NAPI_VERSION=9',
|
||||
'NODE_ADDON_API_DISABLE_DEPRECATED',
|
||||
'NODE_API_SWALLOW_UNTHROWABLE_EXCEPTIONS'
|
||||
],
|
||||
'dependencies': [
|
||||
'<!(node -p "require(\'node-addon-api\').gyp")',
|
||||
'libvips-cpp'
|
||||
],
|
||||
'variables': {
|
||||
'conditions': [
|
||||
['OS != "win"', {
|
||||
'pkg_config_path': '<!(node -p "require(\'../lib/libvips\').pkgConfigPath()")',
|
||||
'use_global_libvips': '<!(node -p "Boolean(require(\'../lib/libvips\').useGlobalLibvips()).toString()")'
|
||||
}, {
|
||||
'pkg_config_path': '',
|
||||
'use_global_libvips': ''
|
||||
}]
|
||||
]
|
||||
},
|
||||
'sources': [
|
||||
'common.cc',
|
||||
'metadata.cc',
|
||||
'stats.cc',
|
||||
'operations.cc',
|
||||
'pipeline.cc',
|
||||
'utilities.cc',
|
||||
'sharp.cc'
|
||||
],
|
||||
'include_dirs': [
|
||||
'<!(node -p "require(\'node-addon-api\').include_dir")',
|
||||
],
|
||||
'conditions': [
|
||||
['use_global_libvips == "true"', {
|
||||
# Use pkg-config for include and lib
|
||||
'include_dirs': ['<!@(PKG_CONFIG_PATH="<(pkg_config_path)" pkg-config --cflags-only-I vips-cpp vips glib-2.0 | sed s\/-I//g)'],
|
||||
'libraries': ['<!@(PKG_CONFIG_PATH="<(pkg_config_path)" pkg-config --libs vips-cpp)'],
|
||||
'defines': [
|
||||
'SHARP_USE_GLOBAL_LIBVIPS'
|
||||
],
|
||||
'conditions': [
|
||||
['OS == "linux"', {
|
||||
'defines': [
|
||||
# Inspect libvips-cpp.so to determine which C++11 ABI version was used and set _GLIBCXX_USE_CXX11_ABI accordingly. This is quite horrible.
|
||||
'_GLIBCXX_USE_CXX11_ABI=<!(if readelf -Ws "$(PKG_CONFIG_PATH="<(pkg_config_path)" pkg-config --variable libdir vips-cpp)/libvips-cpp.so" | c++filt | grep -qF __cxx11;then echo "1";else echo "0";fi)'
|
||||
]
|
||||
}]
|
||||
]
|
||||
}, {
|
||||
# Use pre-built libvips stored locally within node_modules
|
||||
'include_dirs': [
|
||||
'<(sharp_libvips_include_dir)',
|
||||
'<(sharp_libvips_include_dir)/glib-2.0',
|
||||
'<(sharp_libvips_lib_dir)/glib-2.0/include'
|
||||
],
|
||||
'library_dirs': [
|
||||
'<(sharp_libvips_lib_dir)'
|
||||
],
|
||||
'conditions': [
|
||||
['OS == "win"', {
|
||||
'defines': [
|
||||
'_ALLOW_KEYWORD_MACROS',
|
||||
'_FILE_OFFSET_BITS=64'
|
||||
],
|
||||
'link_settings': {
|
||||
'libraries': [
|
||||
'libvips.lib'
|
||||
]
|
||||
}
|
||||
}],
|
||||
['OS == "mac"', {
|
||||
'link_settings': {
|
||||
'libraries': [
|
||||
'libvips-cpp.42.dylib'
|
||||
]
|
||||
},
|
||||
'xcode_settings': {
|
||||
'OTHER_LDFLAGS': [
|
||||
# Ensure runtime linking is relative to sharp.node
|
||||
'-Wl,-rpath,\'@loader_path/../../sharp-libvips-<(platform_and_arch)/lib\'',
|
||||
'-Wl,-rpath,\'@loader_path/../../../sharp-libvips-<(platform_and_arch)/<(sharp_libvips_version)/lib\'',
|
||||
'-Wl,-rpath,\'@loader_path/../../node_modules/@img/sharp-libvips-<(platform_and_arch)/lib\'',
|
||||
'-Wl,-rpath,\'@loader_path/../../../node_modules/@img/sharp-libvips-<(platform_and_arch)/lib\'',
|
||||
'-Wl,-rpath,\'@loader_path/../../../../../@img-sharp-libvips-<(platform_and_arch)-npm-<(sharp_libvips_version)-<(sharp_libvips_yarn_locator)/node_modules/@img/sharp-libvips-<(platform_and_arch)/lib\''
|
||||
]
|
||||
}
|
||||
}],
|
||||
['OS == "linux"', {
|
||||
'defines': [
|
||||
'_GLIBCXX_USE_CXX11_ABI=1'
|
||||
],
|
||||
'link_settings': {
|
||||
'libraries': [
|
||||
'-l:libvips-cpp.so.42'
|
||||
],
|
||||
'ldflags': [
|
||||
'-Wl,-s',
|
||||
'-Wl,--disable-new-dtags',
|
||||
'-Wl,-z,nodelete',
|
||||
'-Wl,-rpath=\'$$ORIGIN/../../sharp-libvips-<(platform_and_arch)/lib\'',
|
||||
'-Wl,-rpath=\'$$ORIGIN/../../../sharp-libvips-<(platform_and_arch)/<(sharp_libvips_version)/lib\'',
|
||||
'-Wl,-rpath=\'$$ORIGIN/../../node_modules/@img/sharp-libvips-<(platform_and_arch)/lib\'',
|
||||
'-Wl,-rpath=\'$$ORIGIN/../../../node_modules/@img/sharp-libvips-<(platform_and_arch)/lib\'',
|
||||
'-Wl,-rpath,\'$$ORIGIN/../../../../../@img-sharp-libvips-<(platform_and_arch)-npm-<(sharp_libvips_version)-<(sharp_libvips_yarn_locator)/node_modules/@img/sharp-libvips-<(platform_and_arch)/lib\''
|
||||
]
|
||||
}
|
||||
}],
|
||||
['OS == "emscripten"', {
|
||||
'product_extension': 'node.js',
|
||||
'link_settings': {
|
||||
'ldflags': [
|
||||
'-fexceptions',
|
||||
'--pre-js=<!(node -p "require.resolve(\'./emscripten/pre.js\')")',
|
||||
'-Oz',
|
||||
'-sALLOW_MEMORY_GROWTH',
|
||||
'-sENVIRONMENT=node',
|
||||
'-sEXPORTED_FUNCTIONS=["emnapiInit", "_vips_shutdown", "_uv_library_shutdown"]',
|
||||
'-sNODERAWFS',
|
||||
'-sTEXTDECODER=0',
|
||||
'-sWASM_ASYNC_COMPILATION=0',
|
||||
'-sWASM_BIGINT'
|
||||
],
|
||||
'libraries': [
|
||||
'<!@(PKG_CONFIG_PATH="<!(node -p "require(\'@img/sharp-libvips-dev-wasm32/lib\')")/pkgconfig" pkg-config --static --libs vips-cpp)'
|
||||
],
|
||||
}
|
||||
}]
|
||||
]
|
||||
}]
|
||||
],
|
||||
'cflags_cc': [
|
||||
'-std=c++0x',
|
||||
'-fexceptions',
|
||||
'-Wall',
|
||||
'-Os'
|
||||
],
|
||||
'xcode_settings': {
|
||||
'CLANG_CXX_LANGUAGE_STANDARD': 'c++11',
|
||||
'MACOSX_DEPLOYMENT_TARGET': '10.13',
|
||||
'GCC_ENABLE_CPP_EXCEPTIONS': 'YES',
|
||||
'GCC_ENABLE_CPP_RTTI': 'YES',
|
||||
'OTHER_CPLUSPLUSFLAGS': [
|
||||
'-fexceptions',
|
||||
'-Wall',
|
||||
'-Oz'
|
||||
]
|
||||
},
|
||||
'configurations': {
|
||||
'Release': {
|
||||
'conditions': [
|
||||
['target_arch == "arm"', {
|
||||
'cflags_cc': [
|
||||
'-Wno-psabi'
|
||||
]
|
||||
}],
|
||||
['OS == "win"', {
|
||||
'msvs_settings': {
|
||||
'VCCLCompilerTool': {
|
||||
'ExceptionHandling': 1,
|
||||
'Optimization': 1,
|
||||
'WholeProgramOptimization': 'true'
|
||||
},
|
||||
'VCLibrarianTool': {
|
||||
'AdditionalOptions': [
|
||||
'/LTCG:INCREMENTAL'
|
||||
]
|
||||
},
|
||||
'VCLinkerTool': {
|
||||
'ImageHasSafeExceptionHandlers': 'false',
|
||||
'OptimizeReferences': 2,
|
||||
'EnableCOMDATFolding': 2,
|
||||
'LinkIncremental': 1,
|
||||
'AdditionalOptions': [
|
||||
'/LTCG:INCREMENTAL'
|
||||
]
|
||||
}
|
||||
},
|
||||
'msvs_disabled_warnings': [
|
||||
4275
|
||||
]
|
||||
}]
|
||||
]
|
||||
}
|
||||
},
|
||||
}, {
|
||||
'target_name': 'copy-dll',
|
||||
'type': 'none',
|
||||
'dependencies': [
|
||||
'sharp-<(platform_and_arch)'
|
||||
],
|
||||
'conditions': [
|
||||
['OS == "win"', {
|
||||
'copies': [{
|
||||
'destination': 'build/Release',
|
||||
'files': [
|
||||
'<(sharp_libvips_lib_dir)/libvips-42.dll'
|
||||
]
|
||||
}]
|
||||
}]
|
||||
]
|
||||
}]
|
||||
}
|
1091
node_modules/sharp/src/common.cc
generated
vendored
Normal file
File diff suppressed because it is too large
393
node_modules/sharp/src/common.h
generated
vendored
Normal file
@@ -0,0 +1,393 @@
|
||||
// Copyright 2013 Lovell Fuller and others.
|
||||
// SPDX-License-Identifier: Apache-2.0
|
||||
|
||||
#ifndef SRC_COMMON_H_
|
||||
#define SRC_COMMON_H_
|
||||
|
||||
#include <string>
|
||||
#include <tuple>
|
||||
#include <vector>
|
||||
#include <atomic>
|
||||
|
||||
#include <napi.h>
|
||||
#include <vips/vips8>
|
||||
|
||||
// Verify platform and compiler compatibility
|
||||
|
||||
#if (VIPS_MAJOR_VERSION < 8) || \
|
||||
(VIPS_MAJOR_VERSION == 8 && VIPS_MINOR_VERSION < 15) || \
|
||||
(VIPS_MAJOR_VERSION == 8 && VIPS_MINOR_VERSION == 15 && VIPS_MICRO_VERSION < 3)
|
||||
#error "libvips version 8.15.3+ is required - please see https://sharp.pixelplumbing.com/install"
|
||||
#endif
|
||||
|
||||
#if ((!defined(__clang__)) && defined(__GNUC__) && (__GNUC__ < 4 || (__GNUC__ == 4 && __GNUC_MINOR__ < 6)))
|
||||
#error "GCC version 4.6+ is required for C++11 features - please see https://sharp.pixelplumbing.com/install"
|
||||
#endif
|
||||
|
||||
#if (defined(__clang__) && defined(__has_feature))
|
||||
#if (!__has_feature(cxx_range_for))
|
||||
#error "clang version 3.0+ is required for C++11 features - please see https://sharp.pixelplumbing.com/install"
|
||||
#endif
|
||||
#endif
|
||||
|
||||
using vips::VImage;
|
||||
|
||||
namespace sharp {
|
||||
|
||||
struct InputDescriptor { // NOLINT(runtime/indentation_namespace)
|
||||
std::string name;
|
||||
std::string file;
|
||||
char *buffer;
|
||||
VipsFailOn failOn;
|
||||
uint64_t limitInputPixels;
|
||||
bool unlimited;
|
||||
VipsAccess access;
|
||||
size_t bufferLength;
|
||||
bool isBuffer;
|
||||
double density;
|
||||
bool ignoreIcc;
|
||||
VipsBandFormat rawDepth;
|
||||
int rawChannels;
|
||||
int rawWidth;
|
||||
int rawHeight;
|
||||
bool rawPremultiplied;
|
||||
int pages;
|
||||
int page;
|
||||
int level;
|
||||
int subifd;
|
||||
int createChannels;
|
||||
int createWidth;
|
||||
int createHeight;
|
||||
std::vector<double> createBackground;
|
||||
std::string createNoiseType;
|
||||
double createNoiseMean;
|
||||
double createNoiseSigma;
|
||||
std::string textValue;
|
||||
std::string textFont;
|
||||
std::string textFontfile;
|
||||
int textWidth;
|
||||
int textHeight;
|
||||
VipsAlign textAlign;
|
||||
bool textJustify;
|
||||
int textDpi;
|
||||
bool textRgba;
|
||||
int textSpacing;
|
||||
VipsTextWrap textWrap;
|
||||
int textAutofitDpi;
|
||||
|
||||
InputDescriptor():
|
||||
buffer(nullptr),
|
||||
failOn(VIPS_FAIL_ON_WARNING),
|
||||
limitInputPixels(0x3FFF * 0x3FFF),
|
||||
unlimited(false),
|
||||
access(VIPS_ACCESS_RANDOM),
|
||||
bufferLength(0),
|
||||
isBuffer(false),
|
||||
density(72.0),
|
||||
ignoreIcc(false),
|
||||
rawDepth(VIPS_FORMAT_UCHAR),
|
||||
rawChannels(0),
|
||||
rawWidth(0),
|
||||
rawHeight(0),
|
||||
rawPremultiplied(false),
|
||||
pages(1),
|
||||
page(0),
|
||||
level(0),
|
||||
subifd(-1),
|
||||
createChannels(0),
|
||||
createWidth(0),
|
||||
createHeight(0),
|
||||
createBackground{ 0.0, 0.0, 0.0, 255.0 },
|
||||
createNoiseMean(0.0),
|
||||
createNoiseSigma(0.0),
|
||||
textWidth(0),
|
||||
textHeight(0),
|
||||
textAlign(VIPS_ALIGN_LOW),
|
||||
textJustify(false),
|
||||
textDpi(72),
|
||||
textRgba(false),
|
||||
textSpacing(0),
|
||||
textWrap(VIPS_TEXT_WRAP_WORD),
|
||||
textAutofitDpi(0) {}
|
||||
};
|
||||
|
||||
// Convenience methods to access the attributes of a Napi::Object
|
||||
bool HasAttr(Napi::Object obj, std::string attr);
|
||||
std::string AttrAsStr(Napi::Object obj, std::string attr);
|
||||
std::string AttrAsStr(Napi::Object obj, unsigned int const attr);
|
||||
uint32_t AttrAsUint32(Napi::Object obj, std::string attr);
|
||||
int32_t AttrAsInt32(Napi::Object obj, std::string attr);
|
||||
int32_t AttrAsInt32(Napi::Object obj, unsigned int const attr);
|
||||
double AttrAsDouble(Napi::Object obj, std::string attr);
|
||||
double AttrAsDouble(Napi::Object obj, unsigned int const attr);
|
||||
bool AttrAsBool(Napi::Object obj, std::string attr);
|
||||
std::vector<double> AttrAsVectorOfDouble(Napi::Object obj, std::string attr);
|
||||
std::vector<int32_t> AttrAsInt32Vector(Napi::Object obj, std::string attr);
|
||||
template <class T> T AttrAsEnum(Napi::Object obj, std::string attr, GType type) {
|
||||
return static_cast<T>(
|
||||
vips_enum_from_nick(nullptr, type, AttrAsStr(obj, attr).data()));
|
||||
}
|
||||
|
||||
// Create an InputDescriptor instance from a Napi::Object describing an input image
|
||||
InputDescriptor* CreateInputDescriptor(Napi::Object input);
|
||||
|
||||
enum class ImageType {
|
||||
JPEG,
|
||||
PNG,
|
||||
WEBP,
|
||||
JP2,
|
||||
TIFF,
|
||||
GIF,
|
||||
SVG,
|
||||
HEIF,
|
||||
PDF,
|
||||
MAGICK,
|
||||
OPENSLIDE,
|
||||
PPM,
|
||||
FITS,
|
||||
EXR,
|
||||
JXL,
|
||||
VIPS,
|
||||
RAW,
|
||||
UNKNOWN,
|
||||
MISSING
|
||||
};
|
||||
|
||||
enum class Canvas {
|
||||
CROP,
|
||||
EMBED,
|
||||
MAX,
|
||||
MIN,
|
||||
IGNORE_ASPECT
|
||||
};
|
||||
|
||||
// How many tasks are in the queue?
|
||||
extern std::atomic<int> counterQueue;
|
||||
|
||||
// How many tasks are being processed?
|
||||
extern std::atomic<int> counterProcess;
|
||||
|
||||
// Filename extension checkers
|
||||
bool IsJpeg(std::string const &str);
|
||||
bool IsPng(std::string const &str);
|
||||
bool IsWebp(std::string const &str);
|
||||
bool IsJp2(std::string const &str);
|
||||
bool IsGif(std::string const &str);
|
||||
bool IsTiff(std::string const &str);
|
||||
bool IsHeic(std::string const &str);
|
||||
bool IsHeif(std::string const &str);
|
||||
bool IsAvif(std::string const &str);
|
||||
bool IsJxl(std::string const &str);
|
||||
bool IsDz(std::string const &str);
|
||||
bool IsDzZip(std::string const &str);
|
||||
bool IsV(std::string const &str);
|
||||
|
||||
/*
|
||||
Trim space from end of string.
|
||||
*/
|
||||
std::string TrimEnd(std::string const &str);
|
||||
|
||||
/*
|
||||
Provide a string identifier for the given image type.
|
||||
*/
|
||||
std::string ImageTypeId(ImageType const imageType);
|
||||
|
||||
/*
|
||||
Determine image format of a buffer.
|
||||
*/
|
||||
ImageType DetermineImageType(void *buffer, size_t const length);
|
||||
|
||||
/*
|
||||
Determine image format of a file.
|
||||
*/
|
||||
ImageType DetermineImageType(char const *file);
|
||||
|
||||
/*
|
||||
Does this image type support multiple pages?
|
||||
*/
|
||||
bool ImageTypeSupportsPage(ImageType imageType);
|
||||
|
||||
/*
|
||||
Does this image type support removal of safety limits?
|
||||
*/
|
||||
bool ImageTypeSupportsUnlimited(ImageType imageType);
|
||||
|
||||
/*
|
||||
Open an image from the given InputDescriptor (filesystem, compressed buffer, raw pixel data)
|
||||
*/
|
||||
std::tuple<VImage, ImageType> OpenInput(InputDescriptor *descriptor);
|
||||
|
||||
/*
|
||||
Does this image have an embedded profile?
|
||||
*/
|
||||
bool HasProfile(VImage image);
|
||||
|
||||
/*
|
||||
Get copy of embedded profile.
|
||||
*/
|
||||
std::pair<char*, size_t> GetProfile(VImage image);
|
||||
|
||||
/*
|
||||
Set embedded profile.
|
||||
*/
|
||||
VImage SetProfile(VImage image, std::pair<char*, size_t> icc);
|
||||
|
||||
/*
|
||||
Does this image have an alpha channel?
|
||||
Uses colour space interpretation with number of channels to guess this.
|
||||
*/
|
||||
bool HasAlpha(VImage image);
|
||||
|
||||
/*
|
||||
Remove all EXIF-related image fields.
|
||||
*/
|
||||
VImage RemoveExif(VImage image);
|
||||
|
||||
/*
|
||||
Get EXIF Orientation of image, if any.
|
||||
*/
|
||||
int ExifOrientation(VImage image);
|
||||
|
||||
/*
|
||||
Set EXIF Orientation of image.
|
||||
*/
|
||||
VImage SetExifOrientation(VImage image, int const orientation);
|
||||
|
||||
/*
|
||||
Remove EXIF Orientation from image.
|
||||
*/
|
||||
VImage RemoveExifOrientation(VImage image);
|
||||
|
||||
/*
|
||||
Set animation properties if necessary.
|
||||
*/
|
||||
VImage SetAnimationProperties(VImage image, int nPages, int pageHeight, std::vector<int> delay, int loop);
|
||||
|
||||
/*
|
||||
Remove animation properties from image.
|
||||
*/
|
||||
VImage RemoveAnimationProperties(VImage image);
|
||||
|
||||
/*
|
||||
Remove GIF palette from image.
|
||||
*/
|
||||
VImage RemoveGifPalette(VImage image);
|
||||
|
||||
/*
|
||||
Does this image have a non-default density?
|
||||
*/
|
||||
bool HasDensity(VImage image);
|
||||
|
||||
/*
|
||||
Get pixels/mm resolution as pixels/inch density.
|
||||
*/
|
||||
int GetDensity(VImage image);
|
||||
|
||||
/*
|
||||
Set pixels/mm resolution based on a pixels/inch density.
|
||||
*/
|
||||
VImage SetDensity(VImage image, const double density);
|
||||
|
||||
/*
|
||||
Multi-page images can have a page height. Fetch it, and sanity check it.
|
||||
If page-height is not set, it defaults to the image height.
|
||||
*/
|
||||
int GetPageHeight(VImage image);
|
||||
|
||||
/*
|
||||
Check the proposed format supports the current dimensions.
|
||||
*/
|
||||
void AssertImageTypeDimensions(VImage image, ImageType const imageType);
|
||||
|
||||
/*
|
||||
Called when a Buffer undergoes GC, required to support mixed runtime libraries in Windows
|
||||
*/
|
||||
extern std::function<void(void*, char*)> FreeCallback;
|
||||
|
||||
/*
|
||||
Called with warnings from the glib-registered "VIPS" domain
|
||||
*/
|
||||
void VipsWarningCallback(char const* log_domain, GLogLevelFlags log_level, char const* message, void* ignore);
|
||||
|
||||
/*
|
||||
Pop the oldest warning message from the queue
|
||||
*/
|
||||
std::string VipsWarningPop();
|
||||
|
||||
/*
|
||||
Attach an event listener for progress updates, used to detect timeout
|
||||
*/
|
||||
void SetTimeout(VImage image, int const timeoutSeconds);
|
||||
|
||||
/*
|
||||
Event listener for progress updates, used to detect timeout
|
||||
*/
|
||||
void VipsProgressCallBack(VipsImage *image, VipsProgress *progress, int *timeoutSeconds);
|
||||
|
||||
/*
|
||||
Calculate the (left, top) coordinates of the output image
|
||||
within the input image, applying the given gravity during an embed.
|
||||
*/
|
||||
std::tuple<int, int> CalculateEmbedPosition(int const inWidth, int const inHeight,
|
||||
int const outWidth, int const outHeight, int const gravity);
|
||||
|
||||
/*
|
||||
Calculate the (left, top) coordinates of the output image
|
||||
within the input image, applying the given gravity.
|
||||
*/
|
||||
std::tuple<int, int> CalculateCrop(int const inWidth, int const inHeight,
|
||||
int const outWidth, int const outHeight, int const gravity);
|
||||
|
||||
/*
|
||||
Calculate the (left, top) coordinates of the output image
|
||||
within the input image, applying the given x and y offsets of the output image.
|
||||
*/
|
||||
std::tuple<int, int> CalculateCrop(int const inWidth, int const inHeight,
|
||||
int const outWidth, int const outHeight, int const x, int const y);
|
||||
|
||||
/*
|
||||
Are pixel values in this image 16-bit integer?
|
||||
*/
|
||||
bool Is16Bit(VipsInterpretation const interpretation);
|
||||
|
||||
/*
|
||||
Return the image alpha maximum. Useful for combining alpha bands. scRGB
|
||||
images are 0 - 1 for image data, but the alpha is 0 - 255.
|
||||
*/
|
||||
double MaximumImageAlpha(VipsInterpretation const interpretation);
|
||||
|
||||
/*
|
||||
Convert RGBA value to another colourspace
|
||||
*/
|
||||
std::vector<double> GetRgbaAsColourspace(std::vector<double> const rgba,
|
||||
VipsInterpretation const interpretation, bool premultiply);
|
||||
|
||||
/*
|
||||
Apply the alpha channel to a given colour
|
||||
*/
|
||||
std::tuple<VImage, std::vector<double>> ApplyAlpha(VImage image, std::vector<double> colour, bool premultiply);
|
||||
|
||||
/*
|
||||
Removes alpha channel, if any.
|
||||
*/
|
||||
VImage RemoveAlpha(VImage image);
|
||||
|
||||
/*
|
||||
Ensures alpha channel, if missing.
|
||||
*/
|
||||
VImage EnsureAlpha(VImage image, double const value);
|
||||
|
||||
/*
|
||||
Calculate the horizontal and vertical shrink factors, taking the canvas mode into account.
|
||||
*/
|
||||
std::pair<double, double> ResolveShrink(int width, int height, int targetWidth, int targetHeight,
|
||||
Canvas canvas, bool withoutEnlargement, bool withoutReduction);
|
||||
|
||||
/*
|
||||
Ensure decoding remains sequential.
|
||||
*/
|
||||
VImage StaySequential(VImage image, bool condition = true);
|
||||
|
||||
} // namespace sharp
|
||||
|
||||
#endif // SRC_COMMON_H_
|
320
node_modules/sharp/src/metadata.cc
generated
vendored
Normal file
@@ -0,0 +1,320 @@
|
||||
// Copyright 2013 Lovell Fuller and others.
|
||||
// SPDX-License-Identifier: Apache-2.0
|
||||
|
||||
#include <numeric>
|
||||
#include <vector>
|
||||
|
||||
#include <napi.h>
|
||||
#include <vips/vips8>
|
||||
|
||||
#include "common.h"
|
||||
#include "metadata.h"
|
||||
|
||||
static void* readPNGComment(VipsImage *image, const char *field, GValue *value, void *p);
|
||||
|
||||
class MetadataWorker : public Napi::AsyncWorker {
|
||||
public:
|
||||
MetadataWorker(Napi::Function callback, MetadataBaton *baton, Napi::Function debuglog) :
|
||||
Napi::AsyncWorker(callback), baton(baton), debuglog(Napi::Persistent(debuglog)) {}
|
||||
~MetadataWorker() {}
|
||||
|
||||
void Execute() {
|
||||
// Decrement queued task counter
|
||||
sharp::counterQueue--;
|
||||
|
||||
vips::VImage image;
|
||||
sharp::ImageType imageType = sharp::ImageType::UNKNOWN;
|
||||
try {
|
||||
std::tie(image, imageType) = OpenInput(baton->input);
|
||||
} catch (vips::VError const &err) {
|
||||
(baton->err).append(err.what());
|
||||
}
|
||||
if (imageType != sharp::ImageType::UNKNOWN) {
|
||||
// Image type
|
||||
baton->format = sharp::ImageTypeId(imageType);
|
||||
// VipsImage attributes
|
||||
baton->width = image.width();
|
||||
baton->height = image.height();
|
||||
baton->space = vips_enum_nick(VIPS_TYPE_INTERPRETATION, image.interpretation());
|
||||
baton->channels = image.bands();
|
||||
baton->depth = vips_enum_nick(VIPS_TYPE_BAND_FORMAT, image.format());
|
||||
if (sharp::HasDensity(image)) {
|
||||
baton->density = sharp::GetDensity(image);
|
||||
}
|
||||
if (image.get_typeof("jpeg-chroma-subsample") == VIPS_TYPE_REF_STRING) {
|
||||
baton->chromaSubsampling = image.get_string("jpeg-chroma-subsample");
|
||||
}
|
||||
if (image.get_typeof("interlaced") == G_TYPE_INT) {
|
||||
baton->isProgressive = image.get_int("interlaced") == 1;
|
||||
}
|
||||
if (image.get_typeof("palette-bit-depth") == G_TYPE_INT) {
|
||||
baton->paletteBitDepth = image.get_int("palette-bit-depth");
|
||||
}
|
||||
if (image.get_typeof(VIPS_META_N_PAGES) == G_TYPE_INT) {
|
||||
baton->pages = image.get_int(VIPS_META_N_PAGES);
|
||||
}
|
||||
if (image.get_typeof(VIPS_META_PAGE_HEIGHT) == G_TYPE_INT) {
|
||||
baton->pageHeight = image.get_int(VIPS_META_PAGE_HEIGHT);
|
||||
}
|
||||
if (image.get_typeof("loop") == G_TYPE_INT) {
|
||||
baton->loop = image.get_int("loop");
|
||||
}
|
||||
if (image.get_typeof("delay") == VIPS_TYPE_ARRAY_INT) {
|
||||
baton->delay = image.get_array_int("delay");
|
||||
}
|
||||
if (image.get_typeof("heif-primary") == G_TYPE_INT) {
|
||||
baton->pagePrimary = image.get_int("heif-primary");
|
||||
}
|
||||
if (image.get_typeof("heif-compression") == VIPS_TYPE_REF_STRING) {
|
||||
baton->compression = image.get_string("heif-compression");
|
||||
}
|
||||
if (image.get_typeof(VIPS_META_RESOLUTION_UNIT) == VIPS_TYPE_REF_STRING) {
|
||||
baton->resolutionUnit = image.get_string(VIPS_META_RESOLUTION_UNIT);
|
||||
}
|
||||
if (image.get_typeof("magick-format") == VIPS_TYPE_REF_STRING) {
|
||||
baton->formatMagick = image.get_string("magick-format");
|
||||
}
|
||||
if (image.get_typeof("openslide.level-count") == VIPS_TYPE_REF_STRING) {
|
||||
int const levels = std::stoi(image.get_string("openslide.level-count"));
|
||||
for (int l = 0; l < levels; l++) {
|
||||
std::string prefix = "openslide.level[" + std::to_string(l) + "].";
|
||||
int const width = std::stoi(image.get_string((prefix + "width").data()));
|
||||
int const height = std::stoi(image.get_string((prefix + "height").data()));
|
||||
baton->levels.push_back(std::pair<int, int>(width, height));
|
||||
}
|
||||
}
|
||||
if (image.get_typeof(VIPS_META_N_SUBIFDS) == G_TYPE_INT) {
|
||||
baton->subifds = image.get_int(VIPS_META_N_SUBIFDS);
|
||||
}
|
||||
baton->hasProfile = sharp::HasProfile(image);
|
||||
if (image.get_typeof("background") == VIPS_TYPE_ARRAY_DOUBLE) {
|
||||
baton->background = image.get_array_double("background");
|
||||
}
|
||||
// Derived attributes
|
||||
baton->hasAlpha = sharp::HasAlpha(image);
|
||||
baton->orientation = sharp::ExifOrientation(image);
|
||||
// EXIF
|
||||
if (image.get_typeof(VIPS_META_EXIF_NAME) == VIPS_TYPE_BLOB) {
|
||||
size_t exifLength;
|
||||
void const *exif = image.get_blob(VIPS_META_EXIF_NAME, &exifLength);
|
||||
baton->exif = static_cast<char*>(g_malloc(exifLength));
|
||||
memcpy(baton->exif, exif, exifLength);
|
||||
baton->exifLength = exifLength;
|
||||
}
|
||||
// ICC profile
|
||||
if (image.get_typeof(VIPS_META_ICC_NAME) == VIPS_TYPE_BLOB) {
|
||||
size_t iccLength;
|
||||
void const *icc = image.get_blob(VIPS_META_ICC_NAME, &iccLength);
|
||||
baton->icc = static_cast<char*>(g_malloc(iccLength));
|
||||
memcpy(baton->icc, icc, iccLength);
|
||||
baton->iccLength = iccLength;
|
||||
}
|
||||
// IPTC
|
||||
if (image.get_typeof(VIPS_META_IPTC_NAME) == VIPS_TYPE_BLOB) {
|
||||
size_t iptcLength;
|
||||
void const *iptc = image.get_blob(VIPS_META_IPTC_NAME, &iptcLength);
|
||||
baton->iptc = static_cast<char *>(g_malloc(iptcLength));
|
||||
memcpy(baton->iptc, iptc, iptcLength);
|
||||
baton->iptcLength = iptcLength;
|
||||
}
|
||||
// XMP
|
||||
if (image.get_typeof(VIPS_META_XMP_NAME) == VIPS_TYPE_BLOB) {
|
||||
size_t xmpLength;
|
||||
void const *xmp = image.get_blob(VIPS_META_XMP_NAME, &xmpLength);
|
||||
baton->xmp = static_cast<char *>(g_malloc(xmpLength));
|
||||
memcpy(baton->xmp, xmp, xmpLength);
|
||||
baton->xmpLength = xmpLength;
|
||||
}
|
||||
// TIFFTAG_PHOTOSHOP
|
||||
if (image.get_typeof(VIPS_META_PHOTOSHOP_NAME) == VIPS_TYPE_BLOB) {
|
||||
size_t tifftagPhotoshopLength;
|
||||
void const *tifftagPhotoshop = image.get_blob(VIPS_META_PHOTOSHOP_NAME, &tifftagPhotoshopLength);
|
||||
baton->tifftagPhotoshop = static_cast<char *>(g_malloc(tifftagPhotoshopLength));
|
||||
memcpy(baton->tifftagPhotoshop, tifftagPhotoshop, tifftagPhotoshopLength);
|
||||
baton->tifftagPhotoshopLength = tifftagPhotoshopLength;
|
||||
}
|
||||
// PNG comments
|
||||
vips_image_map(image.get_image(), readPNGComment, &baton->comments);
|
||||
}
|
||||
|
||||
// Clean up
|
||||
vips_error_clear();
|
||||
vips_thread_shutdown();
|
||||
}
|
||||
|
||||
void OnOK() {
|
||||
Napi::Env env = Env();
|
||||
Napi::HandleScope scope(env);
|
||||
|
||||
// Handle warnings
|
||||
std::string warning = sharp::VipsWarningPop();
|
||||
while (!warning.empty()) {
|
||||
debuglog.Call(Receiver().Value(), { Napi::String::New(env, warning) });
|
||||
warning = sharp::VipsWarningPop();
|
||||
}
|
||||
|
||||
if (baton->err.empty()) {
|
||||
Napi::Object info = Napi::Object::New(env);
|
||||
info.Set("format", baton->format);
|
||||
if (baton->input->bufferLength > 0) {
|
||||
info.Set("size", baton->input->bufferLength);
|
||||
}
|
||||
info.Set("width", baton->width);
|
||||
info.Set("height", baton->height);
|
||||
info.Set("space", baton->space);
|
||||
info.Set("channels", baton->channels);
|
||||
info.Set("depth", baton->depth);
|
||||
if (baton->density > 0) {
|
||||
info.Set("density", baton->density);
|
||||
}
|
||||
if (!baton->chromaSubsampling.empty()) {
|
||||
info.Set("chromaSubsampling", baton->chromaSubsampling);
|
||||
}
|
||||
info.Set("isProgressive", baton->isProgressive);
|
||||
if (baton->paletteBitDepth > 0) {
|
||||
info.Set("paletteBitDepth", baton->paletteBitDepth);
|
||||
}
|
||||
if (baton->pages > 0) {
|
||||
info.Set("pages", baton->pages);
|
||||
}
|
||||
if (baton->pageHeight > 0) {
|
||||
info.Set("pageHeight", baton->pageHeight);
|
||||
}
|
||||
if (baton->loop >= 0) {
|
||||
info.Set("loop", baton->loop);
|
||||
}
|
||||
if (!baton->delay.empty()) {
|
||||
int i = 0;
|
||||
Napi::Array delay = Napi::Array::New(env, static_cast<size_t>(baton->delay.size()));
|
||||
for (int const d : baton->delay) {
|
||||
delay.Set(i++, d);
|
||||
}
|
||||
info.Set("delay", delay);
|
||||
}
|
||||
if (baton->pagePrimary > -1) {
|
||||
info.Set("pagePrimary", baton->pagePrimary);
|
||||
}
|
||||
if (!baton->compression.empty()) {
|
||||
info.Set("compression", baton->compression);
|
||||
}
|
||||
if (!baton->resolutionUnit.empty()) {
|
||||
info.Set("resolutionUnit", baton->resolutionUnit == "in" ? "inch" : baton->resolutionUnit);
|
||||
}
|
||||
if (!baton->formatMagick.empty()) {
|
||||
info.Set("formatMagick", baton->formatMagick);
|
||||
}
|
||||
if (!baton->levels.empty()) {
|
||||
int i = 0;
|
||||
Napi::Array levels = Napi::Array::New(env, static_cast<size_t>(baton->levels.size()));
|
||||
for (std::pair<int, int> const &l : baton->levels) {
|
||||
Napi::Object level = Napi::Object::New(env);
|
||||
level.Set("width", l.first);
|
||||
level.Set("height", l.second);
|
||||
levels.Set(i++, level);
|
||||
}
|
||||
info.Set("levels", levels);
|
||||
}
|
||||
if (baton->subifds > 0) {
|
||||
info.Set("subifds", baton->subifds);
|
||||
}
|
||||
if (!baton->background.empty()) {
|
||||
if (baton->background.size() == 3) {
|
||||
Napi::Object background = Napi::Object::New(env);
|
||||
background.Set("r", baton->background[0]);
|
||||
background.Set("g", baton->background[1]);
|
||||
background.Set("b", baton->background[2]);
|
||||
info.Set("background", background);
|
||||
} else {
|
||||
info.Set("background", baton->background[0]);
|
||||
}
|
||||
}
|
||||
info.Set("hasProfile", baton->hasProfile);
|
||||
info.Set("hasAlpha", baton->hasAlpha);
|
||||
if (baton->orientation > 0) {
|
||||
info.Set("orientation", baton->orientation);
|
||||
}
|
||||
if (baton->exifLength > 0) {
|
||||
info.Set("exif", Napi::Buffer<char>::NewOrCopy(env, baton->exif, baton->exifLength, sharp::FreeCallback));
|
||||
}
|
||||
if (baton->iccLength > 0) {
|
||||
info.Set("icc", Napi::Buffer<char>::NewOrCopy(env, baton->icc, baton->iccLength, sharp::FreeCallback));
|
||||
}
|
||||
if (baton->iptcLength > 0) {
|
||||
info.Set("iptc", Napi::Buffer<char>::NewOrCopy(env, baton->iptc, baton->iptcLength, sharp::FreeCallback));
|
||||
}
|
||||
if (baton->xmpLength > 0) {
|
||||
info.Set("xmp", Napi::Buffer<char>::NewOrCopy(env, baton->xmp, baton->xmpLength, sharp::FreeCallback));
|
||||
}
|
||||
if (baton->tifftagPhotoshopLength > 0) {
|
||||
info.Set("tifftagPhotoshop",
|
||||
Napi::Buffer<char>::NewOrCopy(env, baton->tifftagPhotoshop,
|
||||
baton->tifftagPhotoshopLength, sharp::FreeCallback));
|
||||
}
|
||||
if (baton->comments.size() > 0) {
|
||||
int i = 0;
|
||||
Napi::Array comments = Napi::Array::New(env, baton->comments.size());
|
||||
for (auto &c : baton->comments) {
|
||||
Napi::Object comment = Napi::Object::New(env);
|
||||
comment.Set("keyword", c.first);
|
||||
comment.Set("text", c.second);
|
||||
comments.Set(i++, comment);
|
||||
}
|
||||
info.Set("comments", comments);
|
||||
}
|
||||
Callback().Call(Receiver().Value(), { env.Null(), info });
|
||||
} else {
|
||||
Callback().Call(Receiver().Value(), { Napi::Error::New(env, sharp::TrimEnd(baton->err)).Value() });
|
||||
}
|
||||
|
||||
delete baton->input;
|
||||
delete baton;
|
||||
}
|
||||
|
||||
private:
|
||||
MetadataBaton* baton;
|
||||
Napi::FunctionReference debuglog;
|
||||
};
|
||||
|
||||
/*
|
||||
metadata(options, callback)
|
||||
*/
|
||||
Napi::Value metadata(const Napi::CallbackInfo& info) {
|
||||
// V8 objects are converted to non-V8 types held in the baton struct
|
||||
MetadataBaton *baton = new MetadataBaton;
|
||||
Napi::Object options = info[size_t(0)].As<Napi::Object>();
|
||||
|
||||
// Input
|
||||
baton->input = sharp::CreateInputDescriptor(options.Get("input").As<Napi::Object>());
|
||||
|
||||
// Function to notify of libvips warnings
|
||||
Napi::Function debuglog = options.Get("debuglog").As<Napi::Function>();
|
||||
|
||||
// Join queue for worker thread
|
||||
Napi::Function callback = info[size_t(1)].As<Napi::Function>();
|
||||
MetadataWorker *worker = new MetadataWorker(callback, baton, debuglog);
|
||||
worker->Receiver().Set("options", options);
|
||||
worker->Queue();
|
||||
|
||||
// Increment queued task counter
|
||||
sharp::counterQueue++;
|
||||
|
||||
return info.Env().Undefined();
|
||||
}
|
||||
|
||||
const char *PNG_COMMENT_START = "png-comment-";
|
||||
const int PNG_COMMENT_START_LEN = strlen(PNG_COMMENT_START);
|
||||
|
||||
static void* readPNGComment(VipsImage *image, const char *field, GValue *value, void *p) {
|
||||
MetadataComments *comments = static_cast<MetadataComments *>(p);
|
||||
|
||||
if (vips_isprefix(PNG_COMMENT_START, field)) {
|
||||
const char *keyword = strchr(field + PNG_COMMENT_START_LEN, '-');
|
||||
const char *str;
|
||||
if (keyword != NULL && !vips_image_get_string(image, field, &str)) {
|
||||
keyword++; // Skip the hyphen
|
||||
comments->push_back(std::make_pair(keyword, str));
|
||||
}
|
||||
}
|
||||
|
||||
return NULL;
|
||||
}
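
The MetadataWorker above backs the JavaScript-side metadata() call; a minimal sketch of the public API it ultimately serves, with field names taken from the info properties set in OnOK:

const sharp = require('sharp');

sharp('photo.jpg')
  .metadata()
  .then(({ format, width, height, space, channels, hasAlpha, orientation }) => {
    console.log(format, `${width}x${height}`, space, channels, hasAlpha, orientation);
  });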
|
85
node_modules/sharp/src/metadata.h
generated
vendored
Normal file
@@ -0,0 +1,85 @@
|
||||
// Copyright 2013 Lovell Fuller and others.
|
||||
// SPDX-License-Identifier: Apache-2.0
|
||||
|
||||
#ifndef SRC_METADATA_H_
|
||||
#define SRC_METADATA_H_
|
||||
|
||||
#include <string>
|
||||
#include <napi.h>
|
||||
|
||||
#include "./common.h"
|
||||
|
||||
typedef std::vector<std::pair<std::string, std::string>> MetadataComments;
|
||||
|
||||
struct MetadataBaton {
|
||||
// Input
|
||||
sharp::InputDescriptor *input;
|
||||
// Output
|
||||
std::string format;
|
||||
int width;
|
||||
int height;
|
||||
std::string space;
|
||||
int channels;
|
||||
std::string depth;
|
||||
int density;
|
||||
std::string chromaSubsampling;
|
||||
bool isProgressive;
|
||||
int paletteBitDepth;
|
||||
int pages;
|
||||
int pageHeight;
|
||||
int loop;
|
||||
std::vector<int> delay;
|
||||
int pagePrimary;
|
||||
std::string compression;
|
||||
std::string resolutionUnit;
|
||||
std::string formatMagick;
|
||||
std::vector<std::pair<int, int>> levels;
|
||||
int subifds;
|
||||
std::vector<double> background;
|
||||
bool hasProfile;
|
||||
bool hasAlpha;
|
||||
int orientation;
|
||||
char *exif;
|
||||
size_t exifLength;
|
||||
char *icc;
|
||||
size_t iccLength;
|
||||
char *iptc;
|
||||
size_t iptcLength;
|
||||
char *xmp;
|
||||
size_t xmpLength;
|
||||
char *tifftagPhotoshop;
|
||||
size_t tifftagPhotoshopLength;
|
||||
MetadataComments comments;
|
||||
std::string err;
|
||||
|
||||
MetadataBaton():
|
||||
input(nullptr),
|
||||
width(0),
|
||||
height(0),
|
||||
channels(0),
|
||||
density(0),
|
||||
isProgressive(false),
|
||||
paletteBitDepth(0),
|
||||
pages(0),
|
||||
pageHeight(0),
|
||||
loop(-1),
|
||||
pagePrimary(-1),
|
||||
subifds(0),
|
||||
hasProfile(false),
|
||||
hasAlpha(false),
|
||||
orientation(0),
|
||||
exif(nullptr),
|
||||
exifLength(0),
|
||||
icc(nullptr),
|
||||
iccLength(0),
|
||||
iptc(nullptr),
|
||||
iptcLength(0),
|
||||
xmp(nullptr),
|
||||
xmpLength(0),
|
||||
tifftagPhotoshop(nullptr),
|
||||
tifftagPhotoshopLength(0) {}
|
||||
};
|
||||
|
||||
Napi::Value metadata(const Napi::CallbackInfo& info);
|
||||
|
||||
#endif // SRC_METADATA_H_
|
475
node_modules/sharp/src/operations.cc
generated
vendored
Normal file
@@ -0,0 +1,475 @@
|
||||
// Copyright 2013 Lovell Fuller and others.
|
||||
// SPDX-License-Identifier: Apache-2.0
|
||||
|
||||
#include <algorithm>
|
||||
#include <functional>
|
||||
#include <memory>
|
||||
#include <tuple>
|
||||
#include <vector>
|
||||
#include <vips/vips8>
|
||||
|
||||
#include "common.h"
|
||||
#include "operations.h"
|
||||
|
||||
using vips::VImage;
|
||||
using vips::VError;
|
||||
|
||||
namespace sharp {
|
||||
/*
|
||||
* Tint an image using the provided RGB.
|
||||
*/
|
||||
VImage Tint(VImage image, std::vector<double> const tint) {
|
||||
std::vector<double> const tintLab = (VImage::black(1, 1) + tint)
|
||||
.colourspace(VIPS_INTERPRETATION_LAB, VImage::option()->set("source_space", VIPS_INTERPRETATION_sRGB))
|
||||
.getpoint(0, 0);
|
||||
// LAB identity function
|
||||
VImage identityLab = VImage::identity(VImage::option()->set("bands", 3))
|
||||
.colourspace(VIPS_INTERPRETATION_LAB, VImage::option()->set("source_space", VIPS_INTERPRETATION_sRGB));
|
||||
// Scale luminance range, 0.0 to 1.0
|
||||
VImage l = identityLab[0] / 100;
|
||||
// Weighting functions
|
||||
VImage weightL = 1.0 - 4.0 * ((l - 0.5) * (l - 0.5));
|
||||
VImage weightAB = (weightL * tintLab).extract_band(1, VImage::option()->set("n", 2));
|
||||
identityLab = identityLab[0].bandjoin(weightAB);
|
||||
// Convert lookup table to sRGB
|
||||
VImage lut = identityLab.colourspace(VIPS_INTERPRETATION_sRGB,
|
||||
VImage::option()->set("source_space", VIPS_INTERPRETATION_LAB));
|
||||
// Original colourspace
|
||||
VipsInterpretation typeBeforeTint = image.interpretation();
|
||||
if (typeBeforeTint == VIPS_INTERPRETATION_RGB) {
|
||||
typeBeforeTint = VIPS_INTERPRETATION_sRGB;
|
||||
}
|
||||
// Apply lookup table
|
||||
if (HasAlpha(image)) {
|
||||
VImage alpha = image[image.bands() - 1];
|
||||
image = RemoveAlpha(image)
|
||||
.colourspace(VIPS_INTERPRETATION_B_W)
|
||||
.maplut(lut)
|
||||
.colourspace(typeBeforeTint)
|
||||
.bandjoin(alpha);
|
||||
} else {
|
||||
image = image
|
||||
.colourspace(VIPS_INTERPRETATION_B_W)
|
||||
.maplut(lut)
|
||||
.colourspace(typeBeforeTint);
|
||||
}
|
||||
return image;
|
||||
}
|
||||
|
||||
/*
|
||||
* Stretch luminance to cover full dynamic range.
|
||||
*/
|
||||
VImage Normalise(VImage image, int const lower, int const upper) {
|
||||
// Get original colourspace
|
||||
VipsInterpretation typeBeforeNormalize = image.interpretation();
|
||||
if (typeBeforeNormalize == VIPS_INTERPRETATION_RGB) {
|
||||
typeBeforeNormalize = VIPS_INTERPRETATION_sRGB;
|
||||
}
|
||||
// Convert to LAB colourspace
|
||||
VImage lab = image.colourspace(VIPS_INTERPRETATION_LAB);
|
||||
// Extract luminance
|
||||
VImage luminance = lab[0];
|
||||
|
||||
// Find luminance range
|
||||
int const min = lower == 0 ? luminance.min() : luminance.percent(lower);
|
||||
int const max = upper == 100 ? luminance.max() : luminance.percent(upper);
|
||||
|
||||
if (std::abs(max - min) > 1) {
|
||||
// Extract chroma
|
||||
VImage chroma = lab.extract_band(1, VImage::option()->set("n", 2));
|
||||
// Calculate multiplication factor and addition
|
||||
double f = 100.0 / (max - min);
|
||||
double a = -(min * f);
|
||||
// Scale luminance, join to chroma, convert back to original colourspace
|
||||
VImage normalized = luminance.linear(f, a).bandjoin(chroma).colourspace(typeBeforeNormalize);
|
||||
// Attach original alpha channel, if any
|
||||
if (HasAlpha(image)) {
|
||||
// Extract original alpha channel
|
||||
VImage alpha = image[image.bands() - 1];
|
||||
// Join alpha channel to normalised image
|
||||
return normalized.bandjoin(alpha);
|
||||
} else {
|
||||
return normalized;
|
||||
}
|
||||
}
|
||||
return image;
|
||||
}
|
||||
|
||||
/*
|
||||
* Contrast-limited adaptive histogram equalization (CLAHE)
|
||||
*/
|
||||
VImage Clahe(VImage image, int const width, int const height, int const maxSlope) {
|
||||
return image.hist_local(width, height, VImage::option()->set("max_slope", maxSlope));
|
||||
}
|
||||
|
||||
/*
|
||||
* Gamma encoding/decoding
|
||||
*/
|
||||
VImage Gamma(VImage image, double const exponent) {
|
||||
if (HasAlpha(image)) {
|
||||
// Separate alpha channel
|
||||
VImage alpha = image[image.bands() - 1];
|
||||
return RemoveAlpha(image).gamma(VImage::option()->set("exponent", exponent)).bandjoin(alpha);
|
||||
} else {
|
||||
return image.gamma(VImage::option()->set("exponent", exponent));
|
||||
}
|
||||
}
|
||||
|
||||
/*
|
||||
* Flatten image to remove alpha channel
|
||||
*/
|
||||
VImage Flatten(VImage image, std::vector<double> flattenBackground) {
|
||||
double const multiplier = sharp::Is16Bit(image.interpretation()) ? 256.0 : 1.0;
|
||||
std::vector<double> background {
|
||||
flattenBackground[0] * multiplier,
|
||||
flattenBackground[1] * multiplier,
|
||||
flattenBackground[2] * multiplier
|
||||
};
|
||||
return image.flatten(VImage::option()->set("background", background));
|
||||
}
|
||||
|
||||
/**
|
||||
* Produce the "negative" of the image.
|
||||
*/
|
||||
VImage Negate(VImage image, bool const negateAlpha) {
|
||||
if (HasAlpha(image) && !negateAlpha) {
|
||||
// Separate alpha channel
|
||||
VImage alpha = image[image.bands() - 1];
|
||||
return RemoveAlpha(image).invert().bandjoin(alpha);
|
||||
} else {
|
||||
return image.invert();
|
||||
}
|
||||
}
|
||||
|
||||
/*
|
||||
* Gaussian blur. Use sigma of -1.0 for fast blur.
|
||||
*/
|
||||
VImage Blur(VImage image, double const sigma, VipsPrecision precision, double const minAmpl) {
|
||||
if (sigma == -1.0) {
|
||||
// Fast, mild blur - averages neighbouring pixels
|
||||
VImage blur = VImage::new_matrixv(3, 3,
|
||||
1.0, 1.0, 1.0,
|
||||
1.0, 1.0, 1.0,
|
||||
1.0, 1.0, 1.0);
|
||||
blur.set("scale", 9.0);
|
||||
return image.conv(blur);
|
||||
} else {
|
||||
// Slower, accurate Gaussian blur
|
||||
return StaySequential(image).gaussblur(sigma, VImage::option()
|
||||
->set("precision", precision)
|
||||
->set("min_ampl", minAmpl));
|
||||
}
|
||||
}
|
||||
|
||||
/*
|
||||
* Convolution with a kernel.
|
||||
*/
|
||||
VImage Convolve(VImage image, int const width, int const height,
|
||||
double const scale, double const offset,
|
||||
std::vector<double> const &kernel_v
|
||||
) {
|
||||
VImage kernel = VImage::new_from_memory(
|
||||
static_cast<void*>(const_cast<double*>(kernel_v.data())),
|
||||
width * height * sizeof(double),
|
||||
width,
|
||||
height,
|
||||
1,
|
||||
VIPS_FORMAT_DOUBLE);
|
||||
kernel.set("scale", scale);
|
||||
kernel.set("offset", offset);
|
||||
|
||||
return image.conv(kernel);
|
||||
}
|
||||
|
||||
/*
|
||||
* Recomb with a Matrix of the given bands/channel size.
|
||||
* E.g. RGB will be a 3x3 matrix.
|
||||
*/
|
||||
VImage Recomb(VImage image, std::vector<double> const& matrix) {
|
||||
double* m = const_cast<double*>(matrix.data());
|
||||
image = image.colourspace(VIPS_INTERPRETATION_sRGB);
|
||||
if (matrix.size() == 9) {
|
||||
return image
|
||||
.recomb(image.bands() == 3
|
||||
? VImage::new_matrix(3, 3, m, 9)
|
||||
: VImage::new_matrixv(4, 4,
|
||||
m[0], m[1], m[2], 0.0,
|
||||
m[3], m[4], m[5], 0.0,
|
||||
m[6], m[7], m[8], 0.0,
|
||||
0.0, 0.0, 0.0, 1.0));
|
||||
} else {
|
||||
return image.recomb(VImage::new_matrix(4, 4, m, 16));
|
||||
}
|
||||
}
|
||||
|
||||
VImage Modulate(VImage image, double const brightness, double const saturation,
|
||||
int const hue, double const lightness) {
|
||||
VipsInterpretation colourspaceBeforeModulate = image.interpretation();
|
||||
if (HasAlpha(image)) {
|
||||
// Separate alpha channel
|
||||
VImage alpha = image[image.bands() - 1];
|
||||
return RemoveAlpha(image)
|
||||
.colourspace(VIPS_INTERPRETATION_LCH)
|
||||
.linear(
|
||||
{ brightness, saturation, 1},
|
||||
{ lightness, 0.0, static_cast<double>(hue) }
|
||||
)
|
||||
.colourspace(colourspaceBeforeModulate)
|
||||
.bandjoin(alpha);
|
||||
} else {
|
||||
return image
|
||||
.colourspace(VIPS_INTERPRETATION_LCH)
|
||||
.linear(
|
||||
{ brightness, saturation, 1 },
|
||||
{ lightness, 0.0, static_cast<double>(hue) }
|
||||
)
|
||||
.colourspace(colourspaceBeforeModulate);
|
||||
}
|
||||
}
|
||||
|
||||
/*
|
||||
* Sharpen flat and jagged areas. Use sigma of -1.0 for fast sharpen.
|
||||
*/
|
||||
VImage Sharpen(VImage image, double const sigma, double const m1, double const m2,
|
||||
double const x1, double const y2, double const y3) {
|
||||
if (sigma == -1.0) {
|
||||
// Fast, mild sharpen
|
||||
VImage sharpen = VImage::new_matrixv(3, 3,
|
||||
-1.0, -1.0, -1.0,
|
||||
-1.0, 32.0, -1.0,
|
||||
-1.0, -1.0, -1.0);
|
||||
sharpen.set("scale", 24.0);
|
||||
return image.conv(sharpen);
|
||||
} else {
|
||||
// Slow, accurate sharpen in LAB colour space, with control over flat vs jagged areas
|
||||
VipsInterpretation colourspaceBeforeSharpen = image.interpretation();
|
||||
if (colourspaceBeforeSharpen == VIPS_INTERPRETATION_RGB) {
|
||||
colourspaceBeforeSharpen = VIPS_INTERPRETATION_sRGB;
|
||||
}
|
||||
return image
|
||||
.sharpen(VImage::option()
|
||||
->set("sigma", sigma)
|
||||
->set("m1", m1)
|
||||
->set("m2", m2)
|
||||
->set("x1", x1)
|
||||
->set("y2", y2)
|
||||
->set("y3", y3))
|
||||
.colourspace(colourspaceBeforeSharpen);
|
||||
}
|
||||
}
|
||||
|
||||
VImage Threshold(VImage image, double const threshold, bool const thresholdGrayscale) {
|
||||
if (!thresholdGrayscale) {
|
||||
return image >= threshold;
|
||||
}
|
||||
return image.colourspace(VIPS_INTERPRETATION_B_W) >= threshold;
|
||||
}
|
||||
|
||||
/*
|
||||
Perform boolean/bitwise operation on image color channels - results in one channel image
|
||||
*/
|
||||
VImage Bandbool(VImage image, VipsOperationBoolean const boolean) {
|
||||
image = image.bandbool(boolean);
|
||||
return image.copy(VImage::option()->set("interpretation", VIPS_INTERPRETATION_B_W));
|
||||
}
|
||||
|
||||
/*
|
||||
Perform bitwise boolean operation between images
|
||||
*/
|
||||
VImage Boolean(VImage image, VImage imageR, VipsOperationBoolean const boolean) {
|
||||
return image.boolean(imageR, boolean);
|
||||
}
|
||||
|
||||
/*
|
||||
Trim an image
|
||||
*/
|
||||
VImage Trim(VImage image, std::vector<double> background, double threshold, bool const lineArt) {
|
||||
if (image.width() < 3 && image.height() < 3) {
|
||||
throw VError("Image to trim must be at least 3x3 pixels");
|
||||
}
|
||||
if (background.size() == 0) {
|
||||
// Top-left pixel provides the default background colour if none is given
|
||||
background = image.extract_area(0, 0, 1, 1)(0, 0);
|
||||
} else if (sharp::Is16Bit(image.interpretation())) {
|
||||
for (size_t i = 0; i < background.size(); i++) {
|
||||
background[i] *= 256.0;
|
||||
}
|
||||
threshold *= 256.0;
|
||||
}
|
||||
std::vector<double> backgroundAlpha({ background.back() });
|
||||
if (HasAlpha(image)) {
|
||||
background.pop_back();
|
||||
} else {
|
||||
background.resize(image.bands());
|
||||
}
|
||||
int left, top, width, height;
|
||||
left = image.find_trim(&top, &width, &height, VImage::option()
|
||||
->set("background", background)
|
||||
->set("line_art", lineArt)
|
||||
->set("threshold", threshold));
|
||||
if (HasAlpha(image)) {
|
||||
// Search alpha channel (A)
|
||||
int leftA, topA, widthA, heightA;
|
||||
VImage alpha = image[image.bands() - 1];
|
||||
leftA = alpha.find_trim(&topA, &widthA, &heightA, VImage::option()
|
||||
->set("background", backgroundAlpha)
|
||||
->set("line_art", lineArt)
|
||||
->set("threshold", threshold));
|
||||
if (widthA > 0 && heightA > 0) {
|
||||
if (width > 0 && height > 0) {
|
||||
// Combined bounding box (B)
|
||||
int const leftB = std::min(left, leftA);
|
||||
int const topB = std::min(top, topA);
|
||||
int const widthB = std::max(left + width, leftA + widthA) - leftB;
|
||||
int const heightB = std::max(top + height, topA + heightA) - topB;
|
||||
return image.extract_area(leftB, topB, widthB, heightB);
|
||||
} else {
|
||||
// Use alpha only
|
||||
return image.extract_area(leftA, topA, widthA, heightA);
|
||||
}
|
||||
}
|
||||
}
|
||||
if (width > 0 && height > 0) {
|
||||
return image.extract_area(left, top, width, height);
|
||||
}
|
||||
return image;
|
||||
}
|
||||
|
||||
/*
|
||||
* Calculate (a * in + b)
|
||||
*/
|
||||
VImage Linear(VImage image, std::vector<double> const a, std::vector<double> const b) {
|
||||
size_t const bands = static_cast<size_t>(image.bands());
|
||||
if (a.size() > bands) {
|
||||
throw VError("Band expansion using linear is unsupported");
|
||||
}
|
||||
bool const uchar = !Is16Bit(image.interpretation());
|
||||
if (HasAlpha(image) && a.size() != bands && (a.size() == 1 || a.size() == bands - 1 || bands - 1 == 1)) {
|
||||
// Separate alpha channel
|
||||
VImage alpha = image[bands - 1];
|
||||
return RemoveAlpha(image).linear(a, b, VImage::option()->set("uchar", uchar)).bandjoin(alpha);
|
||||
} else {
|
||||
return image.linear(a, b, VImage::option()->set("uchar", uchar));
|
||||
}
|
||||
}
|
||||
|
||||
/*
|
||||
* Unflatten
|
||||
*/
|
||||
VImage Unflatten(VImage image) {
|
||||
if (HasAlpha(image)) {
|
||||
VImage alpha = image[image.bands() - 1];
|
||||
VImage noAlpha = RemoveAlpha(image);
|
||||
return noAlpha.bandjoin(alpha & (noAlpha.colourspace(VIPS_INTERPRETATION_B_W) < 255));
|
||||
} else {
|
||||
return image.bandjoin(image.colourspace(VIPS_INTERPRETATION_B_W) < 255);
|
||||
}
|
||||
}
|
||||
|
||||
/*
|
||||
* Ensure the image is in a given colourspace
|
||||
*/
|
||||
VImage EnsureColourspace(VImage image, VipsInterpretation colourspace) {
|
||||
if (colourspace != VIPS_INTERPRETATION_LAST && image.interpretation() != colourspace) {
|
||||
image = image.colourspace(colourspace,
|
||||
VImage::option()->set("source_space", image.interpretation()));
|
||||
}
|
||||
return image;
|
||||
}
|
||||
|
||||
/*
|
||||
* Split and crop each frame, reassemble, and update pageHeight.
|
||||
*/
|
||||
VImage CropMultiPage(VImage image, int left, int top, int width, int height,
|
||||
int nPages, int *pageHeight) {
|
||||
if (top == 0 && height == *pageHeight) {
|
||||
// Fast path; no need to adjust the height of the multi-page image
|
||||
return image.extract_area(left, 0, width, image.height());
|
||||
} else {
|
||||
std::vector<VImage> pages;
|
||||
pages.reserve(nPages);
|
||||
|
||||
// Split the image into cropped frames
|
||||
image = StaySequential(image);
|
||||
for (int i = 0; i < nPages; i++) {
|
||||
pages.push_back(
|
||||
image.extract_area(left, *pageHeight * i + top, width, height));
|
||||
}
|
||||
|
||||
// Reassemble the frames into a tall, thin image
|
||||
VImage assembled = VImage::arrayjoin(pages,
|
||||
VImage::option()->set("across", 1));
|
||||
|
||||
// Update the page height
|
||||
*pageHeight = height;
|
||||
|
||||
return assembled;
|
||||
}
|
||||
}
|
||||
|
||||
/*
|
||||
* Split into frames, embed each frame, reassemble, and update pageHeight.
|
||||
*/
|
||||
VImage EmbedMultiPage(VImage image, int left, int top, int width, int height,
|
||||
VipsExtend extendWith, std::vector<double> background, int nPages, int *pageHeight) {
|
||||
if (top == 0 && height == *pageHeight) {
|
||||
// Fast path; no need to adjust the height of the multi-page image
|
||||
return image.embed(left, 0, width, image.height(), VImage::option()
|
||||
->set("extend", extendWith)
|
||||
->set("background", background));
|
||||
} else if (left == 0 && width == image.width()) {
|
||||
// Fast path; no need to adjust the width of the multi-page image
|
||||
std::vector<VImage> pages;
|
||||
pages.reserve(nPages);
|
||||
|
||||
// Rearrange the tall image into a vertical grid
|
||||
image = image.grid(*pageHeight, nPages, 1);
|
||||
|
||||
// Do the embed on the wide image
|
||||
image = image.embed(0, top, image.width(), height, VImage::option()
|
||||
->set("extend", extendWith)
|
||||
->set("background", background));
|
||||
|
||||
// Split the wide image into frames
|
||||
for (int i = 0; i < nPages; i++) {
|
||||
pages.push_back(
|
||||
image.extract_area(width * i, 0, width, height));
|
||||
}
|
||||
|
||||
// Reassemble the frames into a tall, thin image
|
||||
VImage assembled = VImage::arrayjoin(pages,
|
||||
VImage::option()->set("across", 1));
|
||||
|
||||
// Update the page height
|
||||
*pageHeight = height;
|
||||
|
||||
return assembled;
|
||||
} else {
|
||||
std::vector<VImage> pages;
|
||||
pages.reserve(nPages);
|
||||
|
||||
// Split the image into frames
|
||||
for (int i = 0; i < nPages; i++) {
|
||||
pages.push_back(
|
||||
image.extract_area(0, *pageHeight * i, image.width(), *pageHeight));
|
||||
}
|
||||
|
||||
// Embed each frame in the target size
|
||||
for (int i = 0; i < nPages; i++) {
|
||||
pages[i] = pages[i].embed(left, top, width, height, VImage::option()
|
||||
->set("extend", extendWith)
|
||||
->set("background", background));
|
||||
}
|
||||
|
||||
// Reassemble the frames into a tall, thin image
|
||||
VImage assembled = VImage::arrayjoin(pages,
|
||||
VImage::option()->set("across", 1));
|
||||
|
||||
// Update the page height
|
||||
*pageHeight = height;
|
||||
|
||||
return assembled;
|
||||
}
|
||||
}
|
||||
|
||||
} // namespace sharp
|
125
node_modules/sharp/src/operations.h
generated
vendored
Normal file
@@ -0,0 +1,125 @@
// Copyright 2013 Lovell Fuller and others.
// SPDX-License-Identifier: Apache-2.0

#ifndef SRC_OPERATIONS_H_
#define SRC_OPERATIONS_H_

#include <algorithm>
#include <functional>
#include <memory>
#include <tuple>
#include <vips/vips8>

using vips::VImage;

namespace sharp {

/*
 * Tint an image using the provided RGB.
 */
VImage Tint(VImage image, std::vector<double> const tint);

/*
 * Stretch luminance to cover full dynamic range.
 */
VImage Normalise(VImage image, int const lower, int const upper);

/*
 * Contrast limiting adaptive histogram equalization (CLAHE)
 */
VImage Clahe(VImage image, int const width, int const height, int const maxSlope);

/*
 * Gamma encoding/decoding
 */
VImage Gamma(VImage image, double const exponent);

/*
 * Flatten image to remove alpha channel
 */
VImage Flatten(VImage image, std::vector<double> flattenBackground);

/*
 * Produce the "negative" of the image.
 */
VImage Negate(VImage image, bool const negateAlpha);

/*
 * Gaussian blur. Use sigma of -1.0 for fast blur.
 */
VImage Blur(VImage image, double const sigma, VipsPrecision precision, double const minAmpl);

/*
 * Convolution with a kernel.
 */
VImage Convolve(VImage image, int const width, int const height,
  double const scale, double const offset, std::vector<double> const &kernel_v);

/*
 * Sharpen flat and jagged areas. Use sigma of -1.0 for fast sharpen.
 */
VImage Sharpen(VImage image, double const sigma, double const m1, double const m2,
  double const x1, double const y2, double const y3);

/*
  Threshold an image
 */
VImage Threshold(VImage image, double const threshold, bool const thresholdGrayscale);

/*
  Perform boolean/bitwise operation on image color channels - results in one channel image
 */
VImage Bandbool(VImage image, VipsOperationBoolean const boolean);

/*
  Perform bitwise boolean operation between images
 */
VImage Boolean(VImage image, VImage imageR, VipsOperationBoolean const boolean);

/*
  Trim an image
 */
VImage Trim(VImage image, std::vector<double> background, double threshold, bool const lineArt);

/*
 * Linear adjustment (a * in + b)
 */
VImage Linear(VImage image, std::vector<double> const a, std::vector<double> const b);

/*
 * Unflatten
 */
VImage Unflatten(VImage image);

/*
 * Recomb with a Matrix of the given bands/channel size.
 * Eg. RGB will be a 3x3 matrix.
 */
VImage Recomb(VImage image, std::vector<double> const &matrix);

/*
 * Modulate brightness, saturation, hue and lightness
 */
VImage Modulate(VImage image, double const brightness, double const saturation,
  int const hue, double const lightness);

/*
 * Ensure the image is in a given colourspace
 */
VImage EnsureColourspace(VImage image, VipsInterpretation colourspace);

/*
 * Split and crop each frame, reassemble, and update pageHeight.
 */
VImage CropMultiPage(VImage image, int left, int top, int width, int height,
  int nPages, int *pageHeight);

/*
 * Split into frames, embed each frame, reassemble, and update pageHeight.
 */
VImage EmbedMultiPage(VImage image, int left, int top, int width, int height,
  VipsExtend extendWith, std::vector<double> background, int nPages, int *pageHeight);

} // namespace sharp

#endif // SRC_OPERATIONS_H_
1758
node_modules/sharp/src/pipeline.cc
generated
vendored
Normal file
File diff suppressed because it is too large
Load Diff
393
node_modules/sharp/src/pipeline.h
generated
vendored
Normal file
@@ -0,0 +1,393 @@
|
||||
// Copyright 2013 Lovell Fuller and others.
|
||||
// SPDX-License-Identifier: Apache-2.0
|
||||
|
||||
#ifndef SRC_PIPELINE_H_
|
||||
#define SRC_PIPELINE_H_
|
||||
|
||||
#include <memory>
|
||||
#include <string>
|
||||
#include <vector>
|
||||
#include <unordered_map>
|
||||
|
||||
#include <napi.h>
|
||||
#include <vips/vips8>
|
||||
|
||||
#include "./common.h"
|
||||
|
||||
Napi::Value pipeline(const Napi::CallbackInfo& info);
|
||||
|
||||
struct Composite {
|
||||
sharp::InputDescriptor *input;
|
||||
VipsBlendMode mode;
|
||||
int gravity;
|
||||
int left;
|
||||
int top;
|
||||
bool hasOffset;
|
||||
bool tile;
|
||||
bool premultiplied;
|
||||
|
||||
Composite():
|
||||
input(nullptr),
|
||||
mode(VIPS_BLEND_MODE_OVER),
|
||||
gravity(0),
|
||||
left(0),
|
||||
top(0),
|
||||
hasOffset(false),
|
||||
tile(false),
|
||||
premultiplied(false) {}
|
||||
};
|
||||
|
||||
struct PipelineBaton {
|
||||
sharp::InputDescriptor *input;
|
||||
std::string formatOut;
|
||||
std::string fileOut;
|
||||
void *bufferOut;
|
||||
size_t bufferOutLength;
|
||||
int pageHeightOut;
|
||||
int pagesOut;
|
||||
std::vector<Composite *> composite;
|
||||
std::vector<sharp::InputDescriptor *> joinChannelIn;
|
||||
int topOffsetPre;
|
||||
int leftOffsetPre;
|
||||
int widthPre;
|
||||
int heightPre;
|
||||
int topOffsetPost;
|
||||
int leftOffsetPost;
|
||||
int widthPost;
|
||||
int heightPost;
|
||||
int width;
|
||||
int height;
|
||||
int channels;
|
||||
VipsKernel kernel;
|
||||
sharp::Canvas canvas;
|
||||
int position;
|
||||
std::vector<double> resizeBackground;
|
||||
bool hasCropOffset;
|
||||
int cropOffsetLeft;
|
||||
int cropOffsetTop;
|
||||
bool hasAttentionCenter;
|
||||
int attentionX;
|
||||
int attentionY;
|
||||
bool premultiplied;
|
||||
bool tileCentre;
|
||||
bool fastShrinkOnLoad;
|
||||
std::vector<double> tint;
|
||||
bool flatten;
|
||||
std::vector<double> flattenBackground;
|
||||
bool unflatten;
|
||||
bool negate;
|
||||
bool negateAlpha;
|
||||
double blurSigma;
|
||||
VipsPrecision precision;
|
||||
double minAmpl;
|
||||
double brightness;
|
||||
double saturation;
|
||||
int hue;
|
||||
double lightness;
|
||||
int medianSize;
|
||||
double sharpenSigma;
|
||||
double sharpenM1;
|
||||
double sharpenM2;
|
||||
double sharpenX1;
|
||||
double sharpenY2;
|
||||
double sharpenY3;
|
||||
int threshold;
|
||||
bool thresholdGrayscale;
|
||||
std::vector<double> trimBackground;
|
||||
double trimThreshold;
|
||||
bool trimLineArt;
|
||||
int trimOffsetLeft;
|
||||
int trimOffsetTop;
|
||||
std::vector<double> linearA;
|
||||
std::vector<double> linearB;
|
||||
double gamma;
|
||||
double gammaOut;
|
||||
bool greyscale;
|
||||
bool normalise;
|
||||
int normaliseLower;
|
||||
int normaliseUpper;
|
||||
int claheWidth;
|
||||
int claheHeight;
|
||||
int claheMaxSlope;
|
||||
bool useExifOrientation;
|
||||
int angle;
|
||||
double rotationAngle;
|
||||
std::vector<double> rotationBackground;
|
||||
bool rotateBeforePreExtract;
|
||||
bool flip;
|
||||
bool flop;
|
||||
int extendTop;
|
||||
int extendBottom;
|
||||
int extendLeft;
|
||||
int extendRight;
|
||||
std::vector<double> extendBackground;
|
||||
VipsExtend extendWith;
|
||||
bool withoutEnlargement;
|
||||
bool withoutReduction;
|
||||
std::vector<double> affineMatrix;
|
||||
std::vector<double> affineBackground;
|
||||
double affineIdx;
|
||||
double affineIdy;
|
||||
double affineOdx;
|
||||
double affineOdy;
|
||||
std::string affineInterpolator;
|
||||
int jpegQuality;
|
||||
bool jpegProgressive;
|
||||
std::string jpegChromaSubsampling;
|
||||
bool jpegTrellisQuantisation;
|
||||
int jpegQuantisationTable;
|
||||
bool jpegOvershootDeringing;
|
||||
bool jpegOptimiseScans;
|
||||
bool jpegOptimiseCoding;
|
||||
bool pngProgressive;
|
||||
int pngCompressionLevel;
|
||||
bool pngAdaptiveFiltering;
|
||||
bool pngPalette;
|
||||
int pngQuality;
|
||||
int pngEffort;
|
||||
int pngBitdepth;
|
||||
double pngDither;
|
||||
int jp2Quality;
|
||||
bool jp2Lossless;
|
||||
int jp2TileHeight;
|
||||
int jp2TileWidth;
|
||||
std::string jp2ChromaSubsampling;
|
||||
int webpQuality;
|
||||
int webpAlphaQuality;
|
||||
bool webpNearLossless;
|
||||
bool webpLossless;
|
||||
bool webpSmartSubsample;
|
||||
VipsForeignWebpPreset webpPreset;
|
||||
int webpEffort;
|
||||
bool webpMinSize;
|
||||
bool webpMixed;
|
||||
int gifBitdepth;
|
||||
int gifEffort;
|
||||
double gifDither;
|
||||
double gifInterFrameMaxError;
|
||||
double gifInterPaletteMaxError;
|
||||
bool gifReuse;
|
||||
bool gifProgressive;
|
||||
int tiffQuality;
|
||||
VipsForeignTiffCompression tiffCompression;
|
||||
VipsForeignTiffPredictor tiffPredictor;
|
||||
bool tiffPyramid;
|
||||
int tiffBitdepth;
|
||||
bool tiffMiniswhite;
|
||||
bool tiffTile;
|
||||
int tiffTileHeight;
|
||||
int tiffTileWidth;
|
||||
double tiffXres;
|
||||
double tiffYres;
|
||||
VipsForeignTiffResunit tiffResolutionUnit;
|
||||
int heifQuality;
|
||||
VipsForeignHeifCompression heifCompression;
|
||||
int heifEffort;
|
||||
std::string heifChromaSubsampling;
|
||||
bool heifLossless;
|
||||
int heifBitdepth;
|
||||
double jxlDistance;
|
||||
int jxlDecodingTier;
|
||||
int jxlEffort;
|
||||
bool jxlLossless;
|
||||
VipsBandFormat rawDepth;
|
||||
std::string err;
|
||||
int keepMetadata;
|
||||
int withMetadataOrientation;
|
||||
double withMetadataDensity;
|
||||
std::string withIccProfile;
|
||||
std::unordered_map<std::string, std::string> withExif;
|
||||
bool withExifMerge;
|
||||
int timeoutSeconds;
|
||||
std::vector<double> convKernel;
|
||||
int convKernelWidth;
|
||||
int convKernelHeight;
|
||||
double convKernelScale;
|
||||
double convKernelOffset;
|
||||
sharp::InputDescriptor *boolean;
|
||||
VipsOperationBoolean booleanOp;
|
||||
VipsOperationBoolean bandBoolOp;
|
||||
int extractChannel;
|
||||
bool removeAlpha;
|
||||
double ensureAlpha;
|
||||
VipsInterpretation colourspacePipeline;
|
||||
VipsInterpretation colourspace;
|
||||
std::vector<int> delay;
|
||||
int loop;
|
||||
int tileSize;
|
||||
int tileOverlap;
|
||||
VipsForeignDzContainer tileContainer;
|
||||
VipsForeignDzLayout tileLayout;
|
||||
std::string tileFormat;
|
||||
int tileAngle;
|
||||
std::vector<double> tileBackground;
|
||||
int tileSkipBlanks;
|
||||
VipsForeignDzDepth tileDepth;
|
||||
std::string tileId;
|
||||
std::string tileBasename;
|
||||
std::vector<double> recombMatrix;
|
||||
|
||||
PipelineBaton():
|
||||
input(nullptr),
|
||||
bufferOutLength(0),
|
||||
pageHeightOut(0),
|
||||
pagesOut(0),
|
||||
topOffsetPre(-1),
|
||||
topOffsetPost(-1),
|
||||
channels(0),
|
||||
kernel(VIPS_KERNEL_LANCZOS3),
|
||||
canvas(sharp::Canvas::CROP),
|
||||
position(0),
|
||||
resizeBackground{ 0.0, 0.0, 0.0, 255.0 },
|
||||
hasCropOffset(false),
|
||||
cropOffsetLeft(0),
|
||||
cropOffsetTop(0),
|
||||
hasAttentionCenter(false),
|
||||
attentionX(0),
|
||||
attentionY(0),
|
||||
premultiplied(false),
|
||||
tint{ -1.0, 0.0, 0.0, 0.0 },
|
||||
flatten(false),
|
||||
flattenBackground{ 0.0, 0.0, 0.0 },
|
||||
unflatten(false),
|
||||
negate(false),
|
||||
negateAlpha(true),
|
||||
blurSigma(0.0),
|
||||
brightness(1.0),
|
||||
saturation(1.0),
|
||||
hue(0),
|
||||
lightness(0),
|
||||
medianSize(0),
|
||||
sharpenSigma(0.0),
|
||||
sharpenM1(1.0),
|
||||
sharpenM2(2.0),
|
||||
sharpenX1(2.0),
|
||||
sharpenY2(10.0),
|
||||
sharpenY3(20.0),
|
||||
threshold(0),
|
||||
thresholdGrayscale(true),
|
||||
trimBackground{},
|
||||
trimThreshold(-1.0),
|
||||
trimLineArt(false),
|
||||
trimOffsetLeft(0),
|
||||
trimOffsetTop(0),
|
||||
linearA{},
|
||||
linearB{},
|
||||
gamma(0.0),
|
||||
greyscale(false),
|
||||
normalise(false),
|
||||
normaliseLower(1),
|
||||
normaliseUpper(99),
|
||||
claheWidth(0),
|
||||
claheHeight(0),
|
||||
claheMaxSlope(3),
|
||||
useExifOrientation(false),
|
||||
angle(0),
|
||||
rotationAngle(0.0),
|
||||
rotationBackground{ 0.0, 0.0, 0.0, 255.0 },
|
||||
flip(false),
|
||||
flop(false),
|
||||
extendTop(0),
|
||||
extendBottom(0),
|
||||
extendLeft(0),
|
||||
extendRight(0),
|
||||
extendBackground{ 0.0, 0.0, 0.0, 255.0 },
|
||||
extendWith(VIPS_EXTEND_BACKGROUND),
|
||||
withoutEnlargement(false),
|
||||
withoutReduction(false),
|
||||
affineMatrix{ 1.0, 0.0, 0.0, 1.0 },
|
||||
affineBackground{ 0.0, 0.0, 0.0, 255.0 },
|
||||
affineIdx(0),
|
||||
affineIdy(0),
|
||||
affineOdx(0),
|
||||
affineOdy(0),
|
||||
affineInterpolator("bicubic"),
|
||||
jpegQuality(80),
|
||||
jpegProgressive(false),
|
||||
jpegChromaSubsampling("4:2:0"),
|
||||
jpegTrellisQuantisation(false),
|
||||
jpegQuantisationTable(0),
|
||||
jpegOvershootDeringing(false),
|
||||
jpegOptimiseScans(false),
|
||||
jpegOptimiseCoding(true),
|
||||
pngProgressive(false),
|
||||
pngCompressionLevel(6),
|
||||
pngAdaptiveFiltering(false),
|
||||
pngPalette(false),
|
||||
pngQuality(100),
|
||||
pngEffort(7),
|
||||
pngBitdepth(8),
|
||||
pngDither(1.0),
|
||||
jp2Quality(80),
|
||||
jp2Lossless(false),
|
||||
jp2TileHeight(512),
|
||||
jp2TileWidth(512),
|
||||
jp2ChromaSubsampling("4:4:4"),
|
||||
webpQuality(80),
|
||||
webpAlphaQuality(100),
|
||||
webpNearLossless(false),
|
||||
webpLossless(false),
|
||||
webpSmartSubsample(false),
|
||||
webpPreset(VIPS_FOREIGN_WEBP_PRESET_DEFAULT),
|
||||
webpEffort(4),
|
||||
webpMinSize(false),
|
||||
webpMixed(false),
|
||||
gifBitdepth(8),
|
||||
gifEffort(7),
|
||||
gifDither(1.0),
|
||||
gifInterFrameMaxError(0.0),
|
||||
gifInterPaletteMaxError(3.0),
|
||||
gifReuse(true),
|
||||
gifProgressive(false),
|
||||
tiffQuality(80),
|
||||
tiffCompression(VIPS_FOREIGN_TIFF_COMPRESSION_JPEG),
|
||||
tiffPredictor(VIPS_FOREIGN_TIFF_PREDICTOR_HORIZONTAL),
|
||||
tiffPyramid(false),
|
||||
tiffBitdepth(8),
|
||||
tiffMiniswhite(false),
|
||||
tiffTile(false),
|
||||
tiffTileHeight(256),
|
||||
tiffTileWidth(256),
|
||||
tiffXres(1.0),
|
||||
tiffYres(1.0),
|
||||
tiffResolutionUnit(VIPS_FOREIGN_TIFF_RESUNIT_INCH),
|
||||
heifQuality(50),
|
||||
heifCompression(VIPS_FOREIGN_HEIF_COMPRESSION_AV1),
|
||||
heifEffort(4),
|
||||
heifChromaSubsampling("4:4:4"),
|
||||
heifLossless(false),
|
||||
heifBitdepth(8),
|
||||
jxlDistance(1.0),
|
||||
jxlDecodingTier(0),
|
||||
jxlEffort(7),
|
||||
jxlLossless(false),
|
||||
rawDepth(VIPS_FORMAT_UCHAR),
|
||||
keepMetadata(0),
|
||||
withMetadataOrientation(-1),
|
||||
withMetadataDensity(0.0),
|
||||
withExifMerge(true),
|
||||
timeoutSeconds(0),
|
||||
convKernelWidth(0),
|
||||
convKernelHeight(0),
|
||||
convKernelScale(0.0),
|
||||
convKernelOffset(0.0),
|
||||
boolean(nullptr),
|
||||
booleanOp(VIPS_OPERATION_BOOLEAN_LAST),
|
||||
bandBoolOp(VIPS_OPERATION_BOOLEAN_LAST),
|
||||
extractChannel(-1),
|
||||
removeAlpha(false),
|
||||
ensureAlpha(-1.0),
|
||||
colourspacePipeline(VIPS_INTERPRETATION_LAST),
|
||||
colourspace(VIPS_INTERPRETATION_LAST),
|
||||
loop(-1),
|
||||
tileSize(256),
|
||||
tileOverlap(0),
|
||||
tileContainer(VIPS_FOREIGN_DZ_CONTAINER_FS),
|
||||
tileLayout(VIPS_FOREIGN_DZ_LAYOUT_DZ),
|
||||
tileAngle(0),
|
||||
tileBackground{ 255.0, 255.0, 255.0, 255.0 },
|
||||
tileSkipBlanks(-1),
|
||||
tileDepth(VIPS_FOREIGN_DZ_DEPTH_LAST) {}
|
||||
};
|
||||
|
||||
#endif // SRC_PIPELINE_H_
|
40
node_modules/sharp/src/sharp.cc
generated
vendored
Normal file
@@ -0,0 +1,40 @@
// Copyright 2013 Lovell Fuller and others.
// SPDX-License-Identifier: Apache-2.0

#include <mutex>  // NOLINT(build/c++11)

#include <napi.h>
#include <vips/vips8>

#include "common.h"
#include "metadata.h"
#include "pipeline.h"
#include "utilities.h"
#include "stats.h"

Napi::Object init(Napi::Env env, Napi::Object exports) {
  static std::once_flag sharp_vips_init_once;
  std::call_once(sharp_vips_init_once, []() {
    vips_init("sharp");
  });

  g_log_set_handler("VIPS", static_cast<GLogLevelFlags>(G_LOG_LEVEL_WARNING),
    static_cast<GLogFunc>(sharp::VipsWarningCallback), nullptr);

  // Methods available to JavaScript
  exports.Set("metadata", Napi::Function::New(env, metadata));
  exports.Set("pipeline", Napi::Function::New(env, pipeline));
  exports.Set("cache", Napi::Function::New(env, cache));
  exports.Set("concurrency", Napi::Function::New(env, concurrency));
  exports.Set("counters", Napi::Function::New(env, counters));
  exports.Set("simd", Napi::Function::New(env, simd));
  exports.Set("libvipsVersion", Napi::Function::New(env, libvipsVersion));
  exports.Set("format", Napi::Function::New(env, format));
  exports.Set("block", Napi::Function::New(env, block));
  exports.Set("_maxColourDistance", Napi::Function::New(env, _maxColourDistance));
  exports.Set("_isUsingJemalloc", Napi::Function::New(env, _isUsingJemalloc));
  exports.Set("stats", Napi::Function::New(env, stats));
  return exports;
}

NODE_API_MODULE(sharp, init)
183
node_modules/sharp/src/stats.cc
generated
vendored
Normal file
@@ -0,0 +1,183 @@
|
||||
// Copyright 2013 Lovell Fuller and others.
|
||||
// SPDX-License-Identifier: Apache-2.0
|
||||
|
||||
#include <numeric>
|
||||
#include <vector>
|
||||
#include <iostream>
|
||||
|
||||
#include <napi.h>
|
||||
#include <vips/vips8>
|
||||
|
||||
#include "common.h"
|
||||
#include "stats.h"
|
||||
|
||||
class StatsWorker : public Napi::AsyncWorker {
|
||||
public:
|
||||
StatsWorker(Napi::Function callback, StatsBaton *baton, Napi::Function debuglog) :
|
||||
Napi::AsyncWorker(callback), baton(baton), debuglog(Napi::Persistent(debuglog)) {}
|
||||
~StatsWorker() {}
|
||||
|
||||
const int STAT_MIN_INDEX = 0;
|
||||
const int STAT_MAX_INDEX = 1;
|
||||
const int STAT_SUM_INDEX = 2;
|
||||
const int STAT_SQ_SUM_INDEX = 3;
|
||||
const int STAT_MEAN_INDEX = 4;
|
||||
const int STAT_STDEV_INDEX = 5;
|
||||
const int STAT_MINX_INDEX = 6;
|
||||
const int STAT_MINY_INDEX = 7;
|
||||
const int STAT_MAXX_INDEX = 8;
|
||||
const int STAT_MAXY_INDEX = 9;
|
||||
|
||||
void Execute() {
|
||||
// Decrement queued task counter
|
||||
sharp::counterQueue--;
|
||||
|
||||
vips::VImage image;
|
||||
sharp::ImageType imageType = sharp::ImageType::UNKNOWN;
|
||||
try {
|
||||
std::tie(image, imageType) = OpenInput(baton->input);
|
||||
} catch (vips::VError const &err) {
|
||||
(baton->err).append(err.what());
|
||||
}
|
||||
if (imageType != sharp::ImageType::UNKNOWN) {
|
||||
try {
|
||||
vips::VImage stats = image.stats();
|
||||
int const bands = image.bands();
|
||||
for (int b = 1; b <= bands; b++) {
|
||||
ChannelStats cStats(
|
||||
static_cast<int>(stats.getpoint(STAT_MIN_INDEX, b).front()),
|
||||
static_cast<int>(stats.getpoint(STAT_MAX_INDEX, b).front()),
|
||||
stats.getpoint(STAT_SUM_INDEX, b).front(),
|
||||
stats.getpoint(STAT_SQ_SUM_INDEX, b).front(),
|
||||
stats.getpoint(STAT_MEAN_INDEX, b).front(),
|
||||
stats.getpoint(STAT_STDEV_INDEX, b).front(),
|
||||
static_cast<int>(stats.getpoint(STAT_MINX_INDEX, b).front()),
|
||||
static_cast<int>(stats.getpoint(STAT_MINY_INDEX, b).front()),
|
||||
static_cast<int>(stats.getpoint(STAT_MAXX_INDEX, b).front()),
|
||||
static_cast<int>(stats.getpoint(STAT_MAXY_INDEX, b).front()));
|
||||
baton->channelStats.push_back(cStats);
|
||||
}
|
||||
// Image is not opaque when an alpha layer is present and contains a non-maximal value
|
||||
if (sharp::HasAlpha(image)) {
|
||||
double const minAlpha = static_cast<double>(stats.getpoint(STAT_MIN_INDEX, bands).front());
|
||||
if (minAlpha != sharp::MaximumImageAlpha(image.interpretation())) {
|
||||
baton->isOpaque = false;
|
||||
}
|
||||
}
|
||||
// Convert to greyscale
|
||||
vips::VImage greyscale = image.colourspace(VIPS_INTERPRETATION_B_W)[0];
|
||||
// Estimate entropy via histogram of greyscale value frequency
|
||||
baton->entropy = std::abs(greyscale.hist_find().hist_entropy());
|
||||
// Estimate sharpness via standard deviation of greyscale laplacian
|
||||
if (image.width() > 1 || image.height() > 1) {
|
||||
VImage laplacian = VImage::new_matrixv(3, 3,
|
||||
0.0, 1.0, 0.0,
|
||||
1.0, -4.0, 1.0,
|
||||
0.0, 1.0, 0.0);
|
||||
laplacian.set("scale", 9.0);
|
||||
baton->sharpness = greyscale.conv(laplacian).deviate();
|
||||
}
|
||||
// Most dominant sRGB colour via 4096-bin 3D histogram
|
||||
vips::VImage hist = sharp::RemoveAlpha(image)
|
||||
.colourspace(VIPS_INTERPRETATION_sRGB)
|
||||
.hist_find_ndim(VImage::option()->set("bins", 16));
|
||||
std::complex<double> maxpos = hist.maxpos();
|
||||
int const dx = static_cast<int>(std::real(maxpos));
|
||||
int const dy = static_cast<int>(std::imag(maxpos));
|
||||
std::vector<double> pel = hist(dx, dy);
|
||||
int const dz = std::distance(pel.begin(), std::find(pel.begin(), pel.end(), hist.max()));
|
||||
baton->dominantRed = dx * 16 + 8;
|
||||
baton->dominantGreen = dy * 16 + 8;
|
||||
baton->dominantBlue = dz * 16 + 8;
|
||||
} catch (vips::VError const &err) {
|
||||
(baton->err).append(err.what());
|
||||
}
|
||||
}
|
||||
|
||||
// Clean up
|
||||
vips_error_clear();
|
||||
vips_thread_shutdown();
|
||||
}
|
||||
|
||||
void OnOK() {
|
||||
Napi::Env env = Env();
|
||||
Napi::HandleScope scope(env);
|
||||
|
||||
// Handle warnings
|
||||
std::string warning = sharp::VipsWarningPop();
|
||||
while (!warning.empty()) {
|
||||
debuglog.Call(Receiver().Value(), { Napi::String::New(env, warning) });
|
||||
warning = sharp::VipsWarningPop();
|
||||
}
|
||||
|
||||
if (baton->err.empty()) {
|
||||
// Stats Object
|
||||
Napi::Object info = Napi::Object::New(env);
|
||||
Napi::Array channels = Napi::Array::New(env);
|
||||
|
||||
std::vector<ChannelStats>::iterator it;
|
||||
int i = 0;
|
||||
for (it = baton->channelStats.begin(); it < baton->channelStats.end(); it++, i++) {
|
||||
Napi::Object channelStat = Napi::Object::New(env);
|
||||
channelStat.Set("min", it->min);
|
||||
channelStat.Set("max", it->max);
|
||||
channelStat.Set("sum", it->sum);
|
||||
channelStat.Set("squaresSum", it->squaresSum);
|
||||
channelStat.Set("mean", it->mean);
|
||||
channelStat.Set("stdev", it->stdev);
|
||||
channelStat.Set("minX", it->minX);
|
||||
channelStat.Set("minY", it->minY);
|
||||
channelStat.Set("maxX", it->maxX);
|
||||
channelStat.Set("maxY", it->maxY);
|
||||
channels.Set(i, channelStat);
|
||||
}
|
||||
|
||||
info.Set("channels", channels);
|
||||
info.Set("isOpaque", baton->isOpaque);
|
||||
info.Set("entropy", baton->entropy);
|
||||
info.Set("sharpness", baton->sharpness);
|
||||
Napi::Object dominant = Napi::Object::New(env);
|
||||
dominant.Set("r", baton->dominantRed);
|
||||
dominant.Set("g", baton->dominantGreen);
|
||||
dominant.Set("b", baton->dominantBlue);
|
||||
info.Set("dominant", dominant);
|
||||
Callback().Call(Receiver().Value(), { env.Null(), info });
|
||||
} else {
|
||||
Callback().Call(Receiver().Value(), { Napi::Error::New(env, sharp::TrimEnd(baton->err)).Value() });
|
||||
}
|
||||
|
||||
delete baton->input;
|
||||
delete baton;
|
||||
}
|
||||
|
||||
private:
|
||||
StatsBaton* baton;
|
||||
Napi::FunctionReference debuglog;
|
||||
};
|
||||
|
||||
/*
|
||||
stats(options, callback)
|
||||
*/
|
||||
Napi::Value stats(const Napi::CallbackInfo& info) {
|
||||
// V8 objects are converted to non-V8 types held in the baton struct
|
||||
StatsBaton *baton = new StatsBaton;
|
||||
Napi::Object options = info[size_t(0)].As<Napi::Object>();
|
||||
|
||||
// Input
|
||||
baton->input = sharp::CreateInputDescriptor(options.Get("input").As<Napi::Object>());
|
||||
baton->input->access = VIPS_ACCESS_RANDOM;
|
||||
|
||||
// Function to notify of libvips warnings
|
||||
Napi::Function debuglog = options.Get("debuglog").As<Napi::Function>();
|
||||
|
||||
// Join queue for worker thread
|
||||
Napi::Function callback = info[size_t(1)].As<Napi::Function>();
|
||||
StatsWorker *worker = new StatsWorker(callback, baton, debuglog);
|
||||
worker->Receiver().Set("options", options);
|
||||
worker->Queue();
|
||||
|
||||
// Increment queued task counter
|
||||
sharp::counterQueue++;
|
||||
|
||||
return info.Env().Undefined();
|
||||
}
|
59
node_modules/sharp/src/stats.h
generated
vendored
Normal file
@@ -0,0 +1,59 @@
// Copyright 2013 Lovell Fuller and others.
// SPDX-License-Identifier: Apache-2.0

#ifndef SRC_STATS_H_
#define SRC_STATS_H_

#include <string>
#include <napi.h>

#include "./common.h"

struct ChannelStats {
  // stats per channel
  int min;
  int max;
  double sum;
  double squaresSum;
  double mean;
  double stdev;
  int minX;
  int minY;
  int maxX;
  int maxY;

  ChannelStats(int minVal, int maxVal, double sumVal, double squaresSumVal,
    double meanVal, double stdevVal, int minXVal, int minYVal, int maxXVal, int maxYVal):
    min(minVal), max(maxVal), sum(sumVal), squaresSum(squaresSumVal),
    mean(meanVal), stdev(stdevVal), minX(minXVal), minY(minYVal), maxX(maxXVal), maxY(maxYVal) {}
};

struct StatsBaton {
  // Input
  sharp::InputDescriptor *input;

  // Output
  std::vector<ChannelStats> channelStats;
  bool isOpaque;
  double entropy;
  double sharpness;
  int dominantRed;
  int dominantGreen;
  int dominantBlue;

  std::string err;

  StatsBaton():
    input(nullptr),
    isOpaque(true),
    entropy(0.0),
    sharpness(0.0),
    dominantRed(0),
    dominantGreen(0),
    dominantBlue(0)
    {}
};

Napi::Value stats(const Napi::CallbackInfo& info);

#endif // SRC_STATS_H_
269
node_modules/sharp/src/utilities.cc
generated
vendored
Normal file
@@ -0,0 +1,269 @@
|
||||
// Copyright 2013 Lovell Fuller and others.
|
||||
// SPDX-License-Identifier: Apache-2.0
|
||||
|
||||
#include <cmath>
|
||||
#include <string>
|
||||
#include <cstdio>
|
||||
|
||||
#include <napi.h>
|
||||
#include <vips/vips8>
|
||||
#include <vips/vector.h>
|
||||
|
||||
#include "common.h"
|
||||
#include "operations.h"
|
||||
#include "utilities.h"
|
||||
|
||||
/*
|
||||
Get and set cache limits
|
||||
*/
|
||||
Napi::Value cache(const Napi::CallbackInfo& info) {
|
||||
Napi::Env env = info.Env();
|
||||
|
||||
// Set memory limit
|
||||
if (info[size_t(0)].IsNumber()) {
|
||||
vips_cache_set_max_mem(info[size_t(0)].As<Napi::Number>().Int32Value() * 1048576);
|
||||
}
|
||||
// Set file limit
|
||||
if (info[size_t(1)].IsNumber()) {
|
||||
vips_cache_set_max_files(info[size_t(1)].As<Napi::Number>().Int32Value());
|
||||
}
|
||||
// Set items limit
|
||||
if (info[size_t(2)].IsNumber()) {
|
||||
vips_cache_set_max(info[size_t(2)].As<Napi::Number>().Int32Value());
|
||||
}
|
||||
|
||||
// Get memory stats
|
||||
Napi::Object memory = Napi::Object::New(env);
|
||||
memory.Set("current", round(vips_tracked_get_mem() / 1048576));
|
||||
memory.Set("high", round(vips_tracked_get_mem_highwater() / 1048576));
|
||||
memory.Set("max", round(vips_cache_get_max_mem() / 1048576));
|
||||
// Get file stats
|
||||
Napi::Object files = Napi::Object::New(env);
|
||||
files.Set("current", vips_tracked_get_files());
|
||||
files.Set("max", vips_cache_get_max_files());
|
||||
|
||||
// Get item stats
|
||||
Napi::Object items = Napi::Object::New(env);
|
||||
items.Set("current", vips_cache_get_size());
|
||||
items.Set("max", vips_cache_get_max());
|
||||
|
||||
Napi::Object cache = Napi::Object::New(env);
|
||||
cache.Set("memory", memory);
|
||||
cache.Set("files", files);
|
||||
cache.Set("items", items);
|
||||
return cache;
|
||||
}
|
||||
|
||||
/*
|
||||
Get and set size of thread pool
|
||||
*/
|
||||
Napi::Value concurrency(const Napi::CallbackInfo& info) {
|
||||
// Set concurrency
|
||||
if (info[size_t(0)].IsNumber()) {
|
||||
vips_concurrency_set(info[size_t(0)].As<Napi::Number>().Int32Value());
|
||||
}
|
||||
// Get concurrency
|
||||
return Napi::Number::New(info.Env(), vips_concurrency_get());
|
||||
}
|
||||
|
||||
/*
|
||||
Get internal counters (queued tasks, processing tasks)
|
||||
*/
|
||||
Napi::Value counters(const Napi::CallbackInfo& info) {
|
||||
Napi::Object counters = Napi::Object::New(info.Env());
|
||||
counters.Set("queue", static_cast<int>(sharp::counterQueue));
|
||||
counters.Set("process", static_cast<int>(sharp::counterProcess));
|
||||
return counters;
|
||||
}
|
||||
|
||||
/*
|
||||
Get and set use of SIMD vector unit instructions
|
||||
*/
|
||||
Napi::Value simd(const Napi::CallbackInfo& info) {
|
||||
// Set state
|
||||
if (info[size_t(0)].IsBoolean()) {
|
||||
vips_vector_set_enabled(info[size_t(0)].As<Napi::Boolean>().Value());
|
||||
}
|
||||
// Get state
|
||||
return Napi::Boolean::New(info.Env(), vips_vector_isenabled());
|
||||
}
|
||||
|
||||
/*
|
||||
Get libvips version
|
||||
*/
|
||||
Napi::Value libvipsVersion(const Napi::CallbackInfo& info) {
|
||||
Napi::Env env = info.Env();
|
||||
Napi::Object version = Napi::Object::New(env);
|
||||
|
||||
char semver[9];
|
||||
std::snprintf(semver, sizeof(semver), "%d.%d.%d", vips_version(0), vips_version(1), vips_version(2));
|
||||
version.Set("semver", Napi::String::New(env, semver));
|
||||
#ifdef SHARP_USE_GLOBAL_LIBVIPS
|
||||
version.Set("isGlobal", Napi::Boolean::New(env, true));
|
||||
#else
|
||||
version.Set("isGlobal", Napi::Boolean::New(env, false));
|
||||
#endif
|
||||
#ifdef __EMSCRIPTEN__
|
||||
version.Set("isWasm", Napi::Boolean::New(env, true));
|
||||
#else
|
||||
version.Set("isWasm", Napi::Boolean::New(env, false));
|
||||
#endif
|
||||
return version;
|
||||
}
|
||||
|
||||
/*
|
||||
Get available input/output file/buffer/stream formats
|
||||
*/
|
||||
Napi::Value format(const Napi::CallbackInfo& info) {
|
||||
Napi::Env env = info.Env();
|
||||
Napi::Object format = Napi::Object::New(env);
|
||||
for (std::string const f : {
|
||||
"jpeg", "png", "webp", "tiff", "magick", "openslide", "dz",
|
||||
"ppm", "fits", "gif", "svg", "heif", "pdf", "vips", "jp2k", "jxl"
|
||||
}) {
|
||||
// Input
|
||||
const VipsObjectClass *oc = vips_class_find("VipsOperation", (f + "load").c_str());
|
||||
Napi::Boolean hasInputFile = Napi::Boolean::New(env, oc);
|
||||
Napi::Boolean hasInputBuffer =
|
||||
Napi::Boolean::New(env, vips_type_find("VipsOperation", (f + "load_buffer").c_str()));
|
||||
Napi::Object input = Napi::Object::New(env);
|
||||
input.Set("file", hasInputFile);
|
||||
input.Set("buffer", hasInputBuffer);
|
||||
input.Set("stream", hasInputBuffer);
|
||||
if (hasInputFile) {
|
||||
const VipsForeignClass *fc = VIPS_FOREIGN_CLASS(oc);
|
||||
if (fc->suffs) {
|
||||
Napi::Array fileSuffix = Napi::Array::New(env);
|
||||
const char **suffix = fc->suffs;
|
||||
for (int i = 0; *suffix; i++, suffix++) {
|
||||
fileSuffix.Set(i, Napi::String::New(env, *suffix));
|
||||
}
|
||||
input.Set("fileSuffix", fileSuffix);
|
||||
}
|
||||
}
|
||||
// Output
|
||||
Napi::Boolean hasOutputFile =
|
||||
Napi::Boolean::New(env, vips_type_find("VipsOperation", (f + "save").c_str()));
|
||||
Napi::Boolean hasOutputBuffer =
|
||||
Napi::Boolean::New(env, vips_type_find("VipsOperation", (f + "save_buffer").c_str()));
|
||||
Napi::Object output = Napi::Object::New(env);
|
||||
output.Set("file", hasOutputFile);
|
||||
output.Set("buffer", hasOutputBuffer);
|
||||
output.Set("stream", hasOutputBuffer);
|
||||
// Other attributes
|
||||
Napi::Object container = Napi::Object::New(env);
|
||||
container.Set("id", f);
|
||||
container.Set("input", input);
|
||||
container.Set("output", output);
|
||||
// Add to set of formats
|
||||
format.Set(f, container);
|
||||
}
|
||||
|
||||
// Raw, uncompressed data
|
||||
Napi::Boolean supported = Napi::Boolean::New(env, true);
|
||||
Napi::Boolean unsupported = Napi::Boolean::New(env, false);
|
||||
Napi::Object rawInput = Napi::Object::New(env);
|
||||
rawInput.Set("file", unsupported);
|
||||
rawInput.Set("buffer", supported);
|
||||
rawInput.Set("stream", supported);
|
||||
Napi::Object rawOutput = Napi::Object::New(env);
|
||||
rawOutput.Set("file", unsupported);
|
||||
rawOutput.Set("buffer", supported);
|
||||
rawOutput.Set("stream", supported);
|
||||
Napi::Object raw = Napi::Object::New(env);
|
||||
raw.Set("id", "raw");
|
||||
raw.Set("input", rawInput);
|
||||
raw.Set("output", rawOutput);
|
||||
format.Set("raw", raw);
|
||||
|
||||
return format;
|
||||
}
|
||||
|
||||
/*
|
||||
(Un)block libvips operations at runtime.
|
||||
*/
|
||||
void block(const Napi::CallbackInfo& info) {
|
||||
Napi::Array ops = info[size_t(0)].As<Napi::Array>();
|
||||
bool const state = info[size_t(1)].As<Napi::Boolean>().Value();
|
||||
for (unsigned int i = 0; i < ops.Length(); i++) {
|
||||
vips_operation_block_set(ops.Get(i).As<Napi::String>().Utf8Value().c_str(), state);
|
||||
}
|
||||
}
|
||||
|
||||
/*
|
||||
Synchronous, internal-only method used by some of the functional tests.
|
||||
Calculates the maximum colour distance using the DE2000 algorithm
|
||||
between two images of the same dimensions and number of channels.
|
||||
*/
|
||||
Napi::Value _maxColourDistance(const Napi::CallbackInfo& info) {
|
||||
Napi::Env env = info.Env();
|
||||
|
||||
// Open input files
|
||||
VImage image1;
|
||||
sharp::ImageType imageType1 = sharp::DetermineImageType(info[size_t(0)].As<Napi::String>().Utf8Value().data());
|
||||
if (imageType1 != sharp::ImageType::UNKNOWN) {
|
||||
try {
|
||||
image1 = VImage::new_from_file(info[size_t(0)].As<Napi::String>().Utf8Value().c_str());
|
||||
} catch (...) {
|
||||
throw Napi::Error::New(env, "Input file 1 has corrupt header");
|
||||
}
|
||||
} else {
|
||||
throw Napi::Error::New(env, "Input file 1 is of an unsupported image format");
|
||||
}
|
||||
VImage image2;
|
||||
sharp::ImageType imageType2 = sharp::DetermineImageType(info[size_t(1)].As<Napi::String>().Utf8Value().data());
|
||||
if (imageType2 != sharp::ImageType::UNKNOWN) {
|
||||
try {
|
||||
image2 = VImage::new_from_file(info[size_t(1)].As<Napi::String>().Utf8Value().c_str());
|
||||
} catch (...) {
|
||||
throw Napi::Error::New(env, "Input file 2 has corrupt header");
|
||||
}
|
||||
} else {
|
||||
throw Napi::Error::New(env, "Input file 2 is of an unsupported image format");
|
||||
}
|
||||
// Ensure same number of channels
|
||||
if (image1.bands() != image2.bands()) {
|
||||
throw Napi::Error::New(env, "mismatchedBands");
|
||||
}
|
||||
// Ensure same dimensions
|
||||
if (image1.width() != image2.width() || image1.height() != image2.height()) {
|
||||
throw Napi::Error::New(env, "mismatchedDimensions");
|
||||
}
|
||||
|
||||
double maxColourDistance;
|
||||
try {
|
||||
// Premultiply and remove alpha
|
||||
if (sharp::HasAlpha(image1)) {
|
||||
image1 = image1.premultiply().extract_band(1, VImage::option()->set("n", image1.bands() - 1));
|
||||
}
|
||||
if (sharp::HasAlpha(image2)) {
|
||||
image2 = image2.premultiply().extract_band(1, VImage::option()->set("n", image2.bands() - 1));
|
||||
}
|
||||
// Calculate colour distance
|
||||
maxColourDistance = image1.dE00(image2).max();
|
||||
} catch (vips::VError const &err) {
|
||||
throw Napi::Error::New(env, err.what());
|
||||
}
|
||||
|
||||
// Clean up libvips' per-request data and threads
|
||||
vips_error_clear();
|
||||
vips_thread_shutdown();
|
||||
|
||||
return Napi::Number::New(env, maxColourDistance);
|
||||
}
|
||||
|
||||
#if defined(__GNUC__)
|
||||
// mallctl will be resolved by the runtime linker when jemalloc is being used
|
||||
extern "C" {
|
||||
int mallctl(const char *name, void *oldp, size_t *oldlenp, void *newp, size_t newlen) __attribute__((weak));
|
||||
}
|
||||
Napi::Value _isUsingJemalloc(const Napi::CallbackInfo& info) {
|
||||
Napi::Env env = info.Env();
|
||||
return Napi::Boolean::New(env, mallctl != nullptr);
|
||||
}
|
||||
#else
|
||||
Napi::Value _isUsingJemalloc(const Napi::CallbackInfo& info) {
|
||||
Napi::Env env = info.Env();
|
||||
return Napi::Boolean::New(env, false);
|
||||
}
|
||||
#endif
|
19
node_modules/sharp/src/utilities.h
generated
vendored
Normal file
@@ -0,0 +1,19 @@
// Copyright 2013 Lovell Fuller and others.
// SPDX-License-Identifier: Apache-2.0

#ifndef SRC_UTILITIES_H_
#define SRC_UTILITIES_H_

#include <napi.h>

Napi::Value cache(const Napi::CallbackInfo& info);
Napi::Value concurrency(const Napi::CallbackInfo& info);
Napi::Value counters(const Napi::CallbackInfo& info);
Napi::Value simd(const Napi::CallbackInfo& info);
Napi::Value libvipsVersion(const Napi::CallbackInfo& info);
Napi::Value format(const Napi::CallbackInfo& info);
void block(const Napi::CallbackInfo& info);
Napi::Value _maxColourDistance(const Napi::CallbackInfo& info);
Napi::Value _isUsingJemalloc(const Napi::CallbackInfo& info);

#endif // SRC_UTILITIES_H_