Adding an optional image service based on Squoosh (#4738)

* WIP: adding a service built on @squoosh/lib

* WIP: investigating memory leaks in Squoosh

* WIP: vendoring Squoosh to work with our build

* chore: a bit of cleanup and a small perf gain

* removing a few unused deps

* fix: removing temp .only() in sharp test

* hooking up the last build steps to copy over .wasm files

* removing the duplicated lib/*.wasm files

* defaulting to Sharp for the initial @next release

* make sure pnpm always runs the postbuild script

* removing a few node dependencies

* refactor: move the copy .wasm build step out of the SSR bundle

* linter fixes

* fixing lock file

* chore: add TEMP changeset

* fix built wasm location for SSG builds

* Revert "defaulting to Sharp for the initial @next release"

This reverts commit 1a8d4f7f60.

* removing sharp dependency

* Revert "fix built wasm location for SSG builds"

This reverts commit 446b80bb53.

* chore: update lockfile

* fixing up image tests for the wasm loader

* updating the README for squoosh

* parallel wasm builds

* refactor: a bit of house keeping

* perf: allow a thread for each output encoding format

* fix: dev broke with the shift to wasm workers

* adds a new `astro:build:generated` hook for SSG builds

* fix: typo + calling cleanup methods in wasm codecs

* adding @astrojs/webapi for the TransformStream polyfill

* Revert "adding @astrojs/webapi for the TransformStream polyfill"

This reverts commit 39e5b845a5.

* perf: using sharp for most of the CI tests

* chore: update lockfile

* removing hard-coded squoosh imports

* fix: adding sharp to rollup externals

* test: using dev for the squoosh tests

* fix: updating the build output dir for wasm files in SSG builds

* updating the changeset with migration details

* Revert "adds a new `astro:build:generated` hook for SSG builds"

This reverts commit 59b5fec7be.

* nit: adding comments for the wasm file copy

* chore: fix eslint warning
Tony Sullivan 2022-09-22 19:48:14 +00:00 committed by GitHub
parent 6a1a17dd28
commit fad3867adb
82 changed files with 13865 additions and 542 deletions


@ -0,0 +1,24 @@
---
'@astrojs/image': minor
---
Adds a new built-in image service based on WebAssembly libraries :drum: web container support!
**Migration:** Happy with the previous image service based on [`sharp`](https://sharp.pixelplumbing.com/)? No problem! Install `sharp` in your project and update your Astro config to match.
```sh
npm install sharp
```
```js
// astro.config.mjs
import image from '@astrojs/image';

export default {
  // ...
  integrations: [image({
    serviceEntryPoint: '@astrojs/image/sharp'
  })],
}
```


@ -18,7 +18,7 @@ This **[Astro integration][astro-integration]** makes it easy to optimize images
Images play a big role in overall site performance and usability. Serving properly sized images makes all the difference but is often tricky to automate.
This integration provides `<Image />` and `<Picture>` components as well as a basic image transformer powered by [sharp](https://sharp.pixelplumbing.com/), with full support for static sites and server-side rendering. The built-in `sharp` transformer is also replaceable, opening the door for future integrations that work with your favorite hosted image service.
This integration provides `<Image />` and `<Picture>` components as well as a basic image transformer, with full support for static sites and server-side rendering. The built-in image transformer is also replaceable, opening the door for future integrations that work with your favorite hosted image service.
## Installation
@ -57,6 +57,31 @@ export default {
}
```
### Installing `sharp` (optional)
The default image transformer is based on [Squoosh](https://github.com/GoogleChromeLabs/squoosh) and uses WebAssembly libraries to support most deployment environments.
If you are building a static site or using an SSR deployment host that supports Node.js, we recommend installing [sharp](https://sharp.pixelplumbing.com/) for faster builds and more fine-grained control of image transformations.
First, install the `sharp` package using your package manager. If you're using npm or aren't sure, run this in the terminal:
```sh
npm install sharp
```
Then, update the integration in your `astro.config.*` file to use the built-in `sharp` image transformer.
```js
// astro.config.mjs
import image from '@astrojs/image';

export default {
  // ...
  integrations: [image({
    serviceEntryPoint: '@astrojs/image/sharp'
  })],
}
```
### Update `env.d.ts`
For the best development experience, add the integration's type definitions to your project's `env.d.ts` file.
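For reference, the type declarations this step adds usually come down to a single triple-slash reference. The path below matches the `./client` export visible in the package.json diff further down; treat the exact file contents as an assumption.

```ts
// src/env.d.ts (sketch)
// The "@astrojs/image/client" path corresponds to the package's "./client" export.
/// <reference types="@astrojs/image/client" />
```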
@ -85,7 +110,7 @@ import { Image, Picture } from '@astrojs/image/components';
---
```
The included `sharp` transformer supports resizing images and encoding them to different image formats. Third-party image services will be able to add support for custom transformations as well (ex: `blur`, `filter`, `rotate`, etc).
The included image transformers support resizing images and encoding them to different image formats. Third-party image services will be able to add support for custom transformations as well (ex: `blur`, `filter`, `rotate`, etc).
Astro's `<Image />` and `<Picture />` components require the `alt` attribute, which provides descriptive text for images. A warning will be logged if alt text is missing, and a future release of the integration will throw an error if no alt text is provided.
@ -195,6 +220,8 @@ A `number` can also be provided, useful when the aspect ratio is calculated at b
**Default:** `undefined`
</p>
> This is not supported by the default Squoosh service. See the [installation section](#installing-sharp-optional) for details on using the `sharp` service instead.
The background color is used to fill the remaining background when using `contain` for the `fit` property.
The background color is also used for replacing the alpha channel with `sharp`'s `flatten` method. In case the output format
@ -215,6 +242,8 @@ color representation with 3 or 6 hexadecimal characters in the form `#123[abc]`,
**Default:** `'cover'`
</p>
> This is not supported by the default Squoosh service. See the [installation section](#installing-sharp-optional) for details on using the `sharp` service instead.
How the image should be resized to fit both `height` and `width`.
#### position
@ -225,6 +254,8 @@ How the image should be resized to fit both `height` and `width`.
**Default:** `'centre'`
</p>
> This is not supported by the default Squoosh service. See the [installation section](#installing-sharp-optional) for details on using the `sharp` service instead.
Position of the crop when fit is `cover` or `contain`.
### `<Picture />`
@ -316,6 +347,8 @@ The output formats to be used in the optimized image. If not provided, `webp` an
**Default:** `undefined`
</p>
> This is not supported by the default Squoosh service. See the [installation section](#installing-sharp-optional) for details on using the `sharp` service instead.
The background color to use for replacing the alpha channel with `sharp`'s `flatten` method. In case the output format
doesn't support transparency (i.e. `jpeg`), it's advisable to include a background color, otherwise black will be used
as default replacement for transparent pixels.
@ -334,6 +367,8 @@ color representation with 3 or 6 hexadecimal characters in the form `#123[abc]`,
**Default:** `'cover'`
</p>
> This is not supported by the default Squoosh service. See the [installation section](#installing-sharp-optional) for details on using the `sharp` service instead.
How the image should be resized to fit both `height` and `width`.
#### position
@ -346,6 +381,8 @@ How the image should be resized to fit both `height` and `width`.
**Default:** `'centre'`
</p>
> This is not supported by the default Squoosh service. See the [installation section](#installing-sharp-optional) for details on using the `sharp` service instead.
Position of the crop when fit is `cover` or `contain`.
### `getImage`
@ -380,12 +417,12 @@ This helper takes in an object with the same properties as the `<Picture />` com
The integration can be configured to run with a different image service, either a hosted image service or a full image transformer that runs locally in your build or SSR deployment.
> During development, local images may not have been published yet and would not be available to hosted image services. Local images will always use the built-in `sharp` service when using `astro dev`.
> During development, local images may not have been published yet and would not be available to hosted image services. Local images will always use the built-in image service when using `astro dev`.
### config.serviceEntryPoint
The `serviceEntryPoint` should resolve to the image service installed from NPM. The default entry point is `@astrojs/image/sharp`, which resolves to the entry point exported from this integration's `package.json`.
The `serviceEntryPoint` should resolve to the image service installed from NPM. The default entry point is `@astrojs/image/squoosh`, which resolves to the entry point exported from this integration's `package.json`.
```js
// astro.config.mjs
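// Sketch only: the body of this example is cut off by the diff hunk above.
import image from '@astrojs/image';

export default {
  // ...
  integrations: [image({
    // 'example-image-service' is a hypothetical third-party entry point; the
    // built-in values are '@astrojs/image/squoosh' and '@astrojs/image/sharp'.
    serviceEntryPoint: 'example-image-service'
  })],
}
```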


@ -23,6 +23,7 @@
".": "./dist/index.js",
"./endpoint": "./dist/endpoint.js",
"./sharp": "./dist/loaders/sharp.js",
"./squoosh": "./dist/loaders/squoosh.js",
"./components": "./components/index.js",
"./package.json": "./package.json",
"./client": "./client.d.ts",
@ -34,8 +35,9 @@
"client.d.ts"
],
"scripts": {
"build": "astro-scripts build \"src/**/*.ts\" && tsc",
"build:ci": "astro-scripts build \"src/**/*.ts\"",
"build": "astro-scripts build \"src/**/*.ts\" && tsc && pnpm run postbuild",
"build:ci": "astro-scripts build \"src/**/*.ts\" && pnpm run postbuild",
"postbuild": "astro-scripts copy \"src/**/*.wasm\"",
"dev": "astro-scripts dev \"src/**/*.ts\"",
"test": "mocha --exit --timeout 20000 test"
},
@ -43,13 +45,14 @@
"@altano/tiny-async-pool": "^1.0.2",
"image-size": "^1.0.2",
"magic-string": "^0.25.9",
"mime": "^3.0.0",
"sharp": "^0.30.6"
"mime": "^3.0.0"
},
"devDependencies": {
"@types/sharp": "^0.30.5",
"astro": "workspace:*",
"astro-scripts": "workspace:*",
"kleur": "^4.1.4"
"kleur": "^4.1.4",
"rollup-plugin-copy": "^3.4.0",
"web-streams-polyfill": "^3.2.1"
}
}


@ -6,10 +6,31 @@ import OS from 'node:os';
import path from 'node:path';
import { fileURLToPath } from 'node:url';
import type { SSRImageService, TransformOptions } from '../loaders/index.js';
import { loadLocalImage, loadRemoteImage } from '../utils/images.js';
import { debug, info, LoggerLevel, warn } from '../utils/logger.js';
import { isRemoteImage } from '../utils/paths.js';
async function loadLocalImage(src: string | URL) {
try {
return await fs.readFile(src);
} catch {
return undefined;
}
}
async function loadRemoteImage(src: string) {
try {
const res = await fetch(src);
if (!res.ok) {
return undefined;
}
return Buffer.from(await res.arrayBuffer());
} catch {
return undefined;
}
}
function getTimeStat(timeStart: number, timeEnd: number) {
const buildTime = timeEnd - timeStart;
return buildTime < 750 ? `${Math.round(buildTime)}ms` : `${(buildTime / 1000).toFixed(2)}s`;
@ -39,8 +60,6 @@ export async function ssgBuild({ loader, staticImages, config, outDir, logLevel
)}`,
});
const inputFiles = new Set<string>();
async function processStaticImage([src, transformsMap]: [
string,
Map<string, TransformOptions>
@ -61,9 +80,6 @@ export async function ssgBuild({ loader, staticImages, config, outDir, logLevel
const inputFileURL = new URL(`.${src}`, outDir);
inputFile = fileURLToPath(inputFileURL);
inputBuffer = await loadLocalImage(inputFile);
// track the local file used so the original can be copied over
inputFiles.add(inputFile);
}
if (!inputBuffer) {


@ -48,6 +48,7 @@ export const get: APIRoute = async ({ request }) => {
},
});
} catch (err: unknown) {
console.error(err);
return new Response(`Server Error: ${err}`, { status: 500 });
}
};


@ -1,9 +1,10 @@
import type { AstroConfig, AstroIntegration } from 'astro';
import type { AstroConfig, AstroIntegration, BuildConfig } from 'astro';
import { ssgBuild } from './build/ssg.js';
import type { ImageService, TransformOptions } from './loaders/index.js';
import type { ImageService, SSRImageService, TransformOptions } from './loaders/index.js';
import type { LoggerLevel } from './utils/logger.js';
import { joinPaths, prependForwardSlash, propsToFilename } from './utils/paths.js';
import { createPlugin } from './vite-plugin-astro-image.js';
import { copyWasmFiles } from './vendor/squoosh/copy-wasm.js';
export { getImage } from './lib/get-image.js';
export { getPicture } from './lib/get-picture.js';
@ -13,12 +14,13 @@ const ROUTE_PATTERN = '/_image';
interface ImageIntegration {
loader?: ImageService;
defaultLoader: SSRImageService;
addStaticImage?: (transform: TransformOptions) => string;
}
declare global {
// eslint-disable-next-line no-var
var astroImage: ImageIntegration | undefined;
var astroImage: ImageIntegration;
}
export interface IntegrationOptions {
@ -31,12 +33,13 @@ export interface IntegrationOptions {
export default function integration(options: IntegrationOptions = {}): AstroIntegration {
const resolvedOptions = {
serviceEntryPoint: '@astrojs/image/sharp',
serviceEntryPoint: '@astrojs/image/squoosh',
logLevel: 'info' as LoggerLevel,
...options,
};
let _config: AstroConfig;
let _buildConfig: BuildConfig;
// During SSG builds, this is used to track all transformed images required.
const staticImages = new Map<string, Map<string, TransformOptions>>();
@ -45,18 +48,26 @@ export default function integration(options: IntegrationOptions = {}): AstroInte
return {
plugins: [createPlugin(_config, resolvedOptions)],
optimizeDeps: {
include: ['image-size', 'sharp'],
include: [
'image-size',
].filter(Boolean),
},
build: {
rollupOptions: {
external: ["sharp"]
}
},
ssr: {
noExternal: ['@astrojs/image', resolvedOptions.serviceEntryPoint],
},
assetsInclude: ['**/*.wasm']
};
}
return {
name: PKG_NAME,
hooks: {
'astro:config:setup': ({ command, config, updateConfig, injectRoute }) => {
'astro:config:setup': async ({ command, config, updateConfig, injectRoute }) => {
_config = config;
updateConfig({ vite: getViteConfiguration() });
@ -67,8 +78,20 @@ export default function integration(options: IntegrationOptions = {}): AstroInte
entryPoint: '@astrojs/image/endpoint',
});
}
const { default: defaultLoader } = await import(resolvedOptions.serviceEntryPoint === '@astrojs/image/sharp'
? './loaders/sharp.js'
: './loaders/squoosh.js'
);
globalThis.astroImage = {
defaultLoader
}
},
'astro:build:setup': () => {
'astro:build:start': async ({ buildConfig }) => {
_buildConfig = buildConfig
},
'astro:build:setup': async () => {
// Used to cache all images rendered to HTML
// Added to globalThis to share the same map in Node and Vite
function addStaticImage(transform: TransformOptions) {
@ -89,29 +112,36 @@ export default function integration(options: IntegrationOptions = {}): AstroInte
}
// Helpers for building static images should only be available for SSG
globalThis.astroImage =
_config.output === 'static'
? {
addStaticImage,
}
: {};
},
'astro:build:done': async ({ dir }) => {
if (_config.output === 'static') {
// for SSG builds, build all requested image transforms to dist
const loader = globalThis?.astroImage?.loader;
if (loader && 'transform' in loader && staticImages.size > 0) {
await ssgBuild({
loader,
staticImages,
config: _config,
outDir: dir,
logLevel: resolvedOptions.logLevel,
});
}
globalThis.astroImage.addStaticImage = addStaticImage;
}
},
'astro:build:generated': async ({ dir }) => {
// for SSG builds, build all requested image transforms to dist
const loader = globalThis?.astroImage?.loader;
if (resolvedOptions.serviceEntryPoint === '@astrojs/image/squoosh') {
// For the Squoosh service, copy all wasm files to dist/chunks.
// Because the default loader is dynamically imported (above),
// Vite will bundle squoosh to dist/chunks and expect to find the wasm files there
await copyWasmFiles(new URL('./chunks', dir));
}
if (loader && 'transform' in loader && staticImages.size > 0) {
await ssgBuild({
loader,
staticImages,
config: _config,
outDir: dir,
logLevel: resolvedOptions.logLevel,
});
}
},
'astro:build:ssr': async () => {
if (resolvedOptions.serviceEntryPoint === '@astrojs/image/squoosh') {
await copyWasmFiles(_buildConfig.server);
}
}
},
};
}


@ -6,7 +6,6 @@ import type {
TransformOptions,
} from '../loaders/index.js';
import { isSSRService, parseAspectRatio } from '../loaders/index.js';
import sharp from '../loaders/sharp.js';
import { isRemoteImage } from '../utils/paths.js';
import type { ImageMetadata } from '../vite-plugin-astro-image.js';
@ -131,7 +130,7 @@ export async function getImage(
const isDev = import.meta.env?.DEV;
const isLocalImage = !isRemoteImage(resolved.src);
const _loader = isDev && isLocalImage ? sharp : loader;
const _loader = isDev && isLocalImage ? globalThis.astroImage.defaultLoader : loader;
if (!_loader) {
throw new Error('@astrojs/image: loader not found!');
@ -139,7 +138,7 @@ export async function getImage(
const { searchParams } = isSSRService(_loader)
? _loader.serializeTransform(resolved)
: sharp.serializeTransform(resolved);
: globalThis.astroImage.defaultLoader.serializeTransform(resolved);
let src: string;


@ -1,9 +1,9 @@
/// <reference types="astro/astro-jsx" />
import mime from 'mime';
import { extname } from 'node:path';
import { OutputFormat, parseAspectRatio, TransformOptions } from '../loaders/index.js';
import { ImageMetadata } from '../vite-plugin-astro-image.js';
import { getImage } from './get-image.js';
import { extname } from '../utils/paths.js';
export interface GetPictureParams {
src: string | ImageMetadata | Promise<{ default: ImageMetadata }>;


@ -1,4 +1,5 @@
import { htmlColorNames, type NamedColor } from './colornames.js';
import { AstroConfig } from 'astro';
import { htmlColorNames, type NamedColor } from '../utils/colornames.js';
/// <reference types="astro/astro-jsx" />
export type InputFormat =
@ -13,7 +14,7 @@ export type InputFormat =
| 'gif';
export type OutputFormatSupportsAlpha = 'avif' | 'png' | 'webp';
export type OutputFormat = OutputFormatSupportsAlpha | 'jpeg';
export type OutputFormat = OutputFormatSupportsAlpha | 'jpeg' | 'jpg';
export type ColorDefinition =
| NamedColor
@ -49,7 +50,7 @@ export type CropPosition =
| 'attention';
export function isOutputFormat(value: string): value is OutputFormat {
return ['avif', 'jpeg', 'png', 'webp'].includes(value);
return ['avif', 'jpeg', 'jpg', 'png', 'webp'].includes(value);
}
export function isOutputFormatSupportsAlpha(value: string): value is OutputFormatSupportsAlpha {
@ -194,3 +195,109 @@ export function isHostedService(service: ImageService): service is ImageService
export function isSSRService(service: ImageService): service is SSRImageService {
return 'transform' in service;
}
export abstract class BaseSSRService implements SSRImageService {
async getImageAttributes(transform: TransformOptions) {
// strip off the known attributes
const { width, height, src, format, quality, aspectRatio, ...rest } = transform;
return {
...rest,
width: width,
height: height,
};
}
serializeTransform(transform: TransformOptions) {
const searchParams = new URLSearchParams();
if (transform.quality) {
searchParams.append('q', transform.quality.toString());
}
if (transform.format) {
searchParams.append('f', transform.format);
}
if (transform.width) {
searchParams.append('w', transform.width.toString());
}
if (transform.height) {
searchParams.append('h', transform.height.toString());
}
if (transform.aspectRatio) {
searchParams.append('ar', transform.aspectRatio.toString());
}
if (transform.fit) {
searchParams.append('fit', transform.fit);
}
if (transform.background) {
searchParams.append('bg', transform.background);
}
if (transform.position) {
searchParams.append('p', encodeURI(transform.position));
}
searchParams.append('href', transform.src);
return { searchParams };
}
parseTransform(searchParams: URLSearchParams) {
if (!searchParams.has('href')) {
return undefined;
}
let transform: TransformOptions = { src: searchParams.get('href')! };
if (searchParams.has('q')) {
transform.quality = parseInt(searchParams.get('q')!);
}
if (searchParams.has('f')) {
const format = searchParams.get('f')!;
if (isOutputFormat(format)) {
transform.format = format;
}
}
if (searchParams.has('w')) {
transform.width = parseInt(searchParams.get('w')!);
}
if (searchParams.has('h')) {
transform.height = parseInt(searchParams.get('h')!);
}
if (searchParams.has('ar')) {
const ratio = searchParams.get('ar')!;
if (isAspectRatioString(ratio)) {
transform.aspectRatio = ratio;
} else {
transform.aspectRatio = parseFloat(ratio);
}
}
if (searchParams.has('fit')) {
transform.fit = searchParams.get('fit') as typeof transform.fit;
}
if (searchParams.has('p')) {
transform.position = decodeURI(searchParams.get('p')!) as typeof transform.position;
}
if (searchParams.has('bg')) {
transform.background = searchParams.get('bg') as ColorDefinition;
}
return transform;
}
abstract transform(inputBuffer: Buffer, transform: TransformOptions): Promise<{ data: Buffer, format: OutputFormat }>;
}


@ -1,110 +1,9 @@
import sharp from 'sharp';
import {
ColorDefinition,
isAspectRatioString,
isOutputFormat,
isOutputFormatSupportsAlpha,
} from '../loaders/index.js';
import type { OutputFormat, SSRImageService, TransformOptions } from './index.js';
class SharpService implements SSRImageService {
async getImageAttributes(transform: TransformOptions) {
// strip off the known attributes
const { width, height, src, format, quality, aspectRatio, fit, position, background, ...rest } =
transform;
return {
...rest,
width: width,
height: height,
};
}
serializeTransform(transform: TransformOptions) {
const searchParams = new URLSearchParams();
if (transform.quality) {
searchParams.append('q', transform.quality.toString());
}
if (transform.format) {
searchParams.append('f', transform.format);
}
if (transform.width) {
searchParams.append('w', transform.width.toString());
}
if (transform.height) {
searchParams.append('h', transform.height.toString());
}
if (transform.aspectRatio) {
searchParams.append('ar', transform.aspectRatio.toString());
}
if (transform.fit) {
searchParams.append('fit', transform.fit);
}
if (transform.background) {
searchParams.append('bg', transform.background);
}
if (transform.position) {
searchParams.append('p', encodeURI(transform.position));
}
return { searchParams };
}
parseTransform(searchParams: URLSearchParams) {
let transform: TransformOptions = { src: searchParams.get('href')! };
if (searchParams.has('q')) {
transform.quality = parseInt(searchParams.get('q')!);
}
if (searchParams.has('f')) {
const format = searchParams.get('f')!;
if (isOutputFormat(format)) {
transform.format = format;
}
}
if (searchParams.has('w')) {
transform.width = parseInt(searchParams.get('w')!);
}
if (searchParams.has('h')) {
transform.height = parseInt(searchParams.get('h')!);
}
if (searchParams.has('ar')) {
const ratio = searchParams.get('ar')!;
if (isAspectRatioString(ratio)) {
transform.aspectRatio = ratio;
} else {
transform.aspectRatio = parseFloat(ratio);
}
}
if (searchParams.has('fit')) {
transform.fit = searchParams.get('fit') as typeof transform.fit;
}
if (searchParams.has('p')) {
transform.position = decodeURI(searchParams.get('p')!) as typeof transform.position;
}
if (searchParams.has('bg')) {
transform.background = searchParams.get('bg') as ColorDefinition | undefined;
}
return transform;
}
import { BaseSSRService, isOutputFormatSupportsAlpha } from '../loaders/index.js';
import type { SSRImageService } from '../loaders/index.js';
import type { OutputFormat, TransformOptions } from './index.js';
class SharpService extends BaseSSRService {
async transform(inputBuffer: Buffer, transform: TransformOptions) {
const sharpImage = sharp(inputBuffer, { failOnError: false, pages: -1 });


@ -0,0 +1,122 @@
// @ts-ignore
import { red } from 'kleur/colors';
import { BaseSSRService } from './index.js';
import { error } from '../utils/logger.js';
import { metadata } from '../utils/metadata.js';
import { isRemoteImage } from '../utils/paths.js';
import type { OutputFormat, TransformOptions } from './index.js';
import { processBuffer } from '../vendor/squoosh/image-pool.js';
import type { Operation } from '../vendor/squoosh/image.js';
class SquooshService extends BaseSSRService {
async processAvif(image: any, transform: TransformOptions) {
const encodeOptions = transform.quality
? { avif: { quality: transform.quality } }
: { avif: {} };
await image.encode(encodeOptions);
const data = await image.encodedWith.avif;
return {
data: data.binary,
format: 'avif' as OutputFormat,
};
}
async processJpeg(image: any, transform: TransformOptions) {
const encodeOptions = transform.quality
? { mozjpeg: { quality: transform.quality } }
: { mozjpeg: {} };
await image.encode(encodeOptions);
const data = await image.encodedWith.mozjpeg;
return {
data: data.binary,
format: 'jpeg' as OutputFormat,
};
}
async processPng(image: any, transform: TransformOptions) {
await image.encode({ oxipng: {} });
const data = await image.encodedWith.oxipng;
return {
data: data.binary,
format: 'png' as OutputFormat,
};
}
async processWebp(image: any, transform: TransformOptions) {
const encodeOptions = transform.quality
? { webp: { quality: transform.quality } }
: { webp: {} };
await image.encode(encodeOptions);
const data = await image.encodedWith.webp;
return {
data: data.binary,
format: 'webp' as OutputFormat,
};
}
async autorotate(transform: TransformOptions, inputBuffer: Buffer): Promise<Operation | undefined> {
// check EXIF orientation data and rotate the image if needed
try {
const meta = await metadata(transform.src, inputBuffer);
switch (meta?.orientation) {
case 3:
case 4:
return { type: 'rotate', numRotations: 2 };
case 5:
case 6:
return { type: 'rotate', numRotations: 1 };
case 7:
case 8:
return { type: 'rotate', numRotations: 3 };
}
} catch { }
}
async transform(inputBuffer: Buffer, transform: TransformOptions) {
const operations: Operation[] = [];
if (!isRemoteImage(transform.src)) {
const autorotate = await this.autorotate(transform, inputBuffer)
if (autorotate) {
operations.push(autorotate);
}
}
if (transform.width || transform.height) {
const width = transform.width && Math.round(transform.width);
const height = transform.height && Math.round(transform.height);
operations.push({
type: 'resize',
width,
height,
})
}
if (!transform.format) {
error({
level: 'info',
prefix: false,
message: red(`Unknown image output: "${transform.format}" used for ${transform.src}`),
});
throw new Error(`Unknown image output: "${transform.format}" used for ${transform.src}`);
}
const data = await processBuffer(inputBuffer, operations, transform.format, transform.quality || 100);
return {
data: Buffer.from(data),
format: transform.format
}
}
}
const service = new SquooshService();
export default service;
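For illustration, a minimal sketch of calling the Squoosh service directly; the input path is hypothetical and the options mirror the `TransformOptions` fields used above. This is not part of the diff.

```ts
// Sketch: resize and re-encode a local image with the Squoosh service.
import { readFile } from 'node:fs/promises';
import squoosh from './squoosh.js';

const src = './src/assets/hero.png'; // hypothetical input file
const inputBuffer = await readFile(src);

const { data, format } = await squoosh.transform(inputBuffer, {
  src,
  width: 640,
  format: 'webp',
  quality: 80,
});
// `data` is a Buffer with the encoded webp output; `format` echoes 'webp'.
```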


@ -0,0 +1,14 @@
export default function execOnce<T extends (...args: any[]) => ReturnType<T>>(
fn: T
): T {
let used = false
let result: ReturnType<T>
return ((...args: any[]) => {
if (!used) {
used = true
result = fn(...args)
}
return result
}) as T
}
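A quick usage sketch of `execOnce` (names below are made up): the wrapped function runs exactly once and every later call returns the first result, ignoring new arguments.

```ts
// Sketch: later calls reuse the first call's result.
const loadOnce = execOnce((label: string) => ({ label, at: Date.now() }));

const a = loadOnce('first');  // runs the wrapped function
const b = loadOnce('second'); // returns the cached result from the first call
console.log(a === b);         // true
```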


@ -1,23 +0,0 @@
import fs from 'node:fs/promises';
export async function loadLocalImage(src: string | URL) {
try {
return await fs.readFile(src);
} catch {
return undefined;
}
}
export async function loadRemoteImage(src: string) {
try {
const res = await fetch(src);
if (!res.ok) {
return undefined;
}
return Buffer.from(await res.arrayBuffer());
} catch {
return undefined;
}
}


@ -4,8 +4,12 @@ import { fileURLToPath } from 'node:url';
import { InputFormat } from '../loaders/index.js';
import { ImageMetadata } from '../vite-plugin-astro-image.js';
export async function metadata(src: URL): Promise<ImageMetadata | undefined> {
const file = await fs.readFile(src);
export interface Metadata extends ImageMetadata {
orientation?: number;
}
export async function metadata(src: URL | string, data?: Buffer): Promise<Metadata | undefined> {
const file = data || await fs.readFile(src);
const { width, height, type, orientation } = await sizeOf(file);
const isPortrait = (orientation || 0) >= 5;
@ -19,5 +23,6 @@ export async function metadata(src: URL): Promise<ImageMetadata | undefined> {
width: isPortrait ? height : width,
height: isPortrait ? width : height,
format: type as InputFormat,
orientation,
};
}


@ -10,14 +10,15 @@ function removeQueryString(src: string) {
return index > 0 ? src.substring(0, index) : src;
}
function extname(src: string, format?: OutputFormat) {
const index = src.lastIndexOf('.');
export function extname(src: string) {
const base = basename(src);
const index = base.lastIndexOf('.');
if (index <= 0) {
return '';
}
return src.substring(index);
return src.substring(src.length - (base.length - index));
}
function removeExtname(src: string) {
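The reason `extname` now routes through `basename` is that the old version matched the last dot anywhere in the path, including directory names. A small sketch of the behavioral difference, assuming `basename` here resolves to Node's `path.basename`:

```ts
import { basename } from 'node:path';

// Restatement of the new extname, for illustration only.
function extnameSketch(src: string) {
  const base = basename(src);
  const index = base.lastIndexOf('.');
  if (index <= 0) return '';
  return src.substring(src.length - (base.length - index));
}

console.log(extnameSketch('/assets/img.v2/photo')); // '' (a dot in a directory name no longer counts)
console.log(extnameSketch('/assets/photo.png'));    // '.png'
```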


@ -0,0 +1,125 @@
/* tslint-disable ban-types */
import { Worker, parentPort } from 'worker_threads';
import { TransformStream } from 'web-streams-polyfill';
function uuid() {
return Array.from({ length: 16 }, () =>
Math.floor(Math.random() * 256).toString(16),
).join('');
}
interface Job<I> {
msg: I;
resolve: (result: any) => void;
reject: (reason: any) => void;
}
export default class WorkerPool<I, O> {
public numWorkers: number;
public jobQueue: TransformStream<Job<I>, Job<I>>;
public workerQueue: TransformStream<Worker, Worker>;
public done: Promise<void>;
constructor(numWorkers: number, workerFile: string) {
this.numWorkers = numWorkers;
this.jobQueue = new TransformStream();
this.workerQueue = new TransformStream();
const writer = this.workerQueue.writable.getWriter();
for (let i = 0; i < numWorkers; i++) {
writer.write(new Worker(workerFile));
}
writer.releaseLock();
this.done = this._readLoop();
}
async _readLoop() {
const reader = this.jobQueue.readable.getReader();
while (true) {
const { value, done } = await reader.read();
if (done) {
await this._terminateAll();
return;
}
if (!value) {
throw new Error('Reader did not return any value');
}
const { msg, resolve, reject } = value;
const worker = await this._nextWorker();
this.jobPromise(worker, msg)
.then((result) => resolve(result))
.catch((reason) => reject(reason))
.finally(() => {
// Return the worker to the pool
const writer = this.workerQueue.writable.getWriter();
writer.write(worker);
writer.releaseLock();
});
}
}
async _nextWorker() {
const reader = this.workerQueue.readable.getReader();
const { value } = await reader.read();
reader.releaseLock();
if (!value) {
throw new Error('No worker left');
}
return value;
}
async _terminateAll() {
for (let n = 0; n < this.numWorkers; n++) {
const worker = await this._nextWorker();
worker.terminate();
}
this.workerQueue.writable.close();
}
async join() {
this.jobQueue.writable.getWriter().close();
await this.done;
}
dispatchJob(msg: I): Promise<O> {
return new Promise((resolve, reject) => {
const writer = this.jobQueue.writable.getWriter();
writer.write({ msg, resolve, reject });
writer.releaseLock();
});
}
private jobPromise(worker: Worker, msg: I) {
return new Promise((resolve, reject) => {
const id = uuid();
worker.postMessage({ msg, id });
worker.on('message', function f({ error, result, id: rid }) {
if (rid !== id) {
return;
}
if (error) {
reject(error);
return;
}
worker.off('message', f);
resolve(result);
});
});
}
static useThisThreadAsWorker<I, O>(cb: (msg: I) => O) {
parentPort!.on('message', async (data) => {
const { msg, id } = data;
try {
const result = await cb(msg);
parentPort!.postMessage({ result, id });
} catch (e: any) {
parentPort!.postMessage({ error: e.message, id });
}
});
}
}
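A minimal sketch of the intended usage pattern, mirroring how the Squoosh image pool further below uses this class: the same module registers itself as a worker when spawned off-thread and dispatches jobs from the main thread. The file name and message type are hypothetical.

```ts
// double-pool.ts (hypothetical module)
import { isMainThread } from 'node:worker_threads';
import WorkerPool from './workerPool.js';

type Msg = { n: number };

// When the pool spawns this file as a worker, register the job handler.
if (!isMainThread) {
  WorkerPool.useThisThreadAsWorker<Msg, number>((msg) => msg.n * 2);
}

export async function doubleAll(values: number[]): Promise<number[]> {
  // The worker path must point at the built JS file at runtime.
  const pool = new WorkerPool<Msg, number>(2, new URL('./double-pool.js', import.meta.url).pathname);
  const results = await Promise.all(values.map((n) => pool.dispatchJob({ n })));
  await pool.join();
  return results;
}
```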


@ -0,0 +1,245 @@
Source: next.js/packages/next/server/lib/squoosh/LICENSE (vendored from vercel/next.js)
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright [yyyy] [name of copyright owner]
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.


@ -0,0 +1,32 @@
// eslint-disable-next-line no-shadow
export const enum AVIFTune {
auto,
psnr,
ssim,
}
export interface EncodeOptions {
cqLevel: number
denoiseLevel: number
cqAlphaLevel: number
tileRowsLog2: number
tileColsLog2: number
speed: number
subsample: number
chromaDeltaQ: boolean
sharpness: number
tune: AVIFTune
}
export interface AVIFModule extends EmscriptenWasm.Module {
encode(
data: BufferSource,
width: number,
height: number,
options: EncodeOptions
): Uint8Array
}
declare var moduleFactory: EmscriptenWasm.ModuleFactory<AVIFModule>
export default moduleFactory

File diff suppressed because it is too large.

File diff suppressed because it is too large.


@ -0,0 +1,373 @@
import { promises as fsp } from 'node:fs'
import { instantiateEmscriptenWasm, pathify } from './emscripten-utils.js'
interface DecodeModule extends EmscriptenWasm.Module {
decode: (data: Uint8Array) => ImageData
}
type DecodeModuleFactory = EmscriptenWasm.ModuleFactory<DecodeModule>
interface RotateModuleInstance {
exports: {
memory: WebAssembly.Memory
rotate(width: number, height: number, rotate: number): void
}
}
interface ResizeWithAspectParams {
input_width: number
input_height: number
target_width?: number
target_height?: number
}
export interface ResizeOptions {
width?: number
height?: number
method: 'triangle' | 'catrom' | 'mitchell' | 'lanczos3'
premultiply: boolean
linearRGB: boolean
}
export interface RotateOptions {
numRotations: number
}
// MozJPEG
import type { MozJPEGModule as MozJPEGEncodeModule } from './mozjpeg/mozjpeg_enc'
// @ts-ignore
import mozEnc from './mozjpeg/mozjpeg_node_enc.js'
const mozEncWasm = new URL('./mozjpeg/mozjpeg_node_enc.wasm', import.meta.url)
// @ts-ignore
import mozDec from './mozjpeg/mozjpeg_node_dec.js'
const mozDecWasm = new URL('./mozjpeg/mozjpeg_node_dec.wasm', import.meta.url)
// WebP
import type { WebPModule as WebPEncodeModule } from './webp/webp_enc'
// @ts-ignore
import webpEnc from './webp/webp_node_enc.js'
const webpEncWasm = new URL('./webp/webp_node_enc.wasm', import.meta.url)
// @ts-ignore
import webpDec from './webp/webp_node_dec.js'
const webpDecWasm = new URL('./webp/webp_node_dec.wasm', import.meta.url)
// AVIF
import type { AVIFModule as AVIFEncodeModule } from './avif/avif_enc'
// @ts-ignore
import avifEnc from './avif/avif_node_enc.js'
const avifEncWasm = new URL('./avif/avif_node_enc.wasm', import.meta.url)
// @ts-ignore
import avifDec from './avif/avif_node_dec.js'
const avifDecWasm = new URL('./avif/avif_node_dec.wasm', import.meta.url)
// PNG
// @ts-ignore
import * as pngEncDec from './png/squoosh_png.js'
const pngEncDecWasm = new URL('./png/squoosh_png_bg.wasm', import.meta.url)
const pngEncDecInit = () =>
pngEncDec.default(fsp.readFile(pathify(pngEncDecWasm.toString())))
// OxiPNG
// @ts-ignore
import * as oxipng from './png/squoosh_oxipng.js'
const oxipngWasm = new URL('./png/squoosh_oxipng_bg.wasm', import.meta.url)
const oxipngInit = () => oxipng.default(fsp.readFile(pathify(oxipngWasm.toString())))
// Resize
// @ts-ignore
import * as resize from './resize/squoosh_resize.js'
const resizeWasm = new URL('./resize/squoosh_resize_bg.wasm', import.meta.url)
const resizeInit = () => resize.default(fsp.readFile(pathify(resizeWasm.toString())))
// rotate
const rotateWasm = new URL('./rotate/rotate.wasm', import.meta.url)
// Our decoders currently rely on a `ImageData` global.
import ImageData from './image_data.js';
(global as any).ImageData = ImageData
function resizeNameToIndex(
name: 'triangle' | 'catrom' | 'mitchell' | 'lanczos3'
) {
switch (name) {
case 'triangle':
return 0
case 'catrom':
return 1
case 'mitchell':
return 2
case 'lanczos3':
return 3
default:
throw Error(`Unknown resize algorithm "${name}"`)
}
}
function resizeWithAspect({
input_width,
input_height,
target_width,
target_height,
}: ResizeWithAspectParams): { width: number; height: number } {
if (!target_width && !target_height) {
throw Error('Need to specify at least width or height when resizing')
}
if (target_width && target_height) {
return { width: target_width, height: target_height }
}
if (!target_width) {
return {
width: Math.round((input_width / input_height) * target_height!),
height: target_height!,
}
}
return {
width: target_width,
height: Math.round((input_height / input_width) * target_width),
}
}
export const preprocessors = {
resize: {
name: 'Resize',
description: 'Resize the image before compressing',
instantiate: async () => {
await resizeInit()
return (
buffer: Uint8Array,
input_width: number,
input_height: number,
{ width, height, method, premultiply, linearRGB }: ResizeOptions
) => {
;({ width, height } = resizeWithAspect({
input_width,
input_height,
target_width: width,
target_height: height,
}))
const imageData = new ImageData(
resize.resize(
buffer,
input_width,
input_height,
width,
height,
resizeNameToIndex(method),
premultiply,
linearRGB
),
width,
height
)
resize.cleanup()
return imageData
}
},
defaultOptions: {
method: 'lanczos3',
fitMethod: 'stretch',
premultiply: true,
linearRGB: true,
},
},
rotate: {
name: 'Rotate',
description: 'Rotate image',
instantiate: async () => {
return async (
buffer: Uint8Array,
width: number,
height: number,
{ numRotations }: RotateOptions
) => {
const degrees = (numRotations * 90) % 360
const sameDimensions = degrees === 0 || degrees === 180
const size = width * height * 4
const instance = (
await WebAssembly.instantiate(await fsp.readFile(pathify(rotateWasm.toString())))
).instance as RotateModuleInstance
const { memory } = instance.exports
const additionalPagesNeeded = Math.ceil(
(size * 2 - memory.buffer.byteLength + 8) / (64 * 1024)
)
if (additionalPagesNeeded > 0) {
memory.grow(additionalPagesNeeded)
}
const view = new Uint8ClampedArray(memory.buffer)
view.set(buffer, 8)
instance.exports.rotate(width, height, degrees)
return new ImageData(
view.slice(size + 8, size * 2 + 8),
sameDimensions ? width : height,
sameDimensions ? height : width
)
}
},
defaultOptions: {
numRotations: 0,
},
},
} as const
export const codecs = {
mozjpeg: {
name: 'MozJPEG',
extension: 'jpg',
detectors: [/^\xFF\xD8\xFF/],
dec: () =>
instantiateEmscriptenWasm(mozDec as DecodeModuleFactory, mozDecWasm.toString()),
enc: () =>
instantiateEmscriptenWasm(
mozEnc as EmscriptenWasm.ModuleFactory<MozJPEGEncodeModule>,
mozEncWasm.toString()
),
defaultEncoderOptions: {
quality: 75,
baseline: false,
arithmetic: false,
progressive: true,
optimize_coding: true,
smoothing: 0,
color_space: 3 /*YCbCr*/,
quant_table: 3,
trellis_multipass: false,
trellis_opt_zero: false,
trellis_opt_table: false,
trellis_loops: 1,
auto_subsample: true,
chroma_subsample: 2,
separate_chroma_quality: false,
chroma_quality: 75,
},
autoOptimize: {
option: 'quality',
min: 0,
max: 100,
},
},
webp: {
name: 'WebP',
extension: 'webp',
detectors: [/^RIFF....WEBPVP8[LX ]/s],
dec: () =>
instantiateEmscriptenWasm(webpDec as DecodeModuleFactory, webpDecWasm.toString()),
enc: () =>
instantiateEmscriptenWasm(
webpEnc as EmscriptenWasm.ModuleFactory<WebPEncodeModule>,
webpEncWasm.toString()
),
defaultEncoderOptions: {
quality: 75,
target_size: 0,
target_PSNR: 0,
method: 4,
sns_strength: 50,
filter_strength: 60,
filter_sharpness: 0,
filter_type: 1,
partitions: 0,
segments: 4,
pass: 1,
show_compressed: 0,
preprocessing: 0,
autofilter: 0,
partition_limit: 0,
alpha_compression: 1,
alpha_filtering: 1,
alpha_quality: 100,
lossless: 0,
exact: 0,
image_hint: 0,
emulate_jpeg_size: 0,
thread_level: 0,
low_memory: 0,
near_lossless: 100,
use_delta_palette: 0,
use_sharp_yuv: 0,
},
autoOptimize: {
option: 'quality',
min: 0,
max: 100,
},
},
avif: {
name: 'AVIF',
extension: 'avif',
// eslint-disable-next-line no-control-regex
detectors: [/^\x00\x00\x00 ftypavif\x00\x00\x00\x00/],
dec: () =>
instantiateEmscriptenWasm(avifDec as DecodeModuleFactory, avifDecWasm.toString()),
enc: async () => {
return instantiateEmscriptenWasm(
avifEnc as EmscriptenWasm.ModuleFactory<AVIFEncodeModule>,
avifEncWasm.toString()
)
},
defaultEncoderOptions: {
cqLevel: 33,
cqAlphaLevel: -1,
denoiseLevel: 0,
tileColsLog2: 0,
tileRowsLog2: 0,
speed: 6,
subsample: 1,
chromaDeltaQ: false,
sharpness: 0,
tune: 0 /* AVIFTune.auto */,
},
autoOptimize: {
option: 'cqLevel',
min: 62,
max: 0,
},
},
oxipng: {
name: 'OxiPNG',
extension: 'png',
// eslint-disable-next-line no-control-regex
detectors: [/^\x89PNG\x0D\x0A\x1A\x0A/],
dec: async () => {
await pngEncDecInit()
return {
decode: (buffer: Buffer | Uint8Array) => {
const imageData = pngEncDec.decode(buffer)
pngEncDec.cleanup()
return imageData
},
}
},
enc: async () => {
await pngEncDecInit()
await oxipngInit()
return {
encode: (
buffer: Uint8ClampedArray | ArrayBuffer,
width: number,
height: number,
opts: { level: number }
) => {
const simplePng = pngEncDec.encode(
new Uint8Array(buffer),
width,
height
)
const imageData = oxipng.optimise(simplePng, opts.level, false)
oxipng.cleanup()
return imageData
},
}
},
defaultEncoderOptions: {
level: 2,
},
autoOptimize: {
option: 'level',
min: 6,
max: 1,
},
},
} as const


@ -0,0 +1,24 @@
import fs from 'node:fs/promises';
import path from 'node:path';
import { fileURLToPath } from 'node:url';
export async function copyWasmFiles(dir: URL) {
const src = new URL('./', import.meta.url);
await copyDir(fileURLToPath(src), fileURLToPath(dir));
}
async function copyDir(src: string, dest: string) {
const itemNames = await fs.readdir(src);
await Promise.all(itemNames.map(async (srcName) => {
const srcPath = path.join(src, srcName);
const destPath = path.join(dest, srcName);
const s = await fs.stat(srcPath);
if (s.isFile() && /.wasm$/.test(srcPath)) {
await fs.mkdir(path.dirname(destPath), { recursive: true });
await fs.copyFile(srcPath, destPath);
}
else if (s.isDirectory()) {
await copyDir(srcPath, destPath);
}
}));
}


@ -0,0 +1,121 @@
// These types roughly model the object that the JS files generated by Emscripten define. Copied from https://github.com/DefinitelyTyped/DefinitelyTyped/blob/master/types/emscripten/index.d.ts and turned into a type definition rather than a global to support our way of using Emscripten.
declare namespace EmscriptenWasm {
type ModuleFactory<T extends Module = Module> = (
moduleOverrides?: ModuleOpts
) => Promise<T>
type EnvironmentType = 'WEB' | 'NODE' | 'SHELL' | 'WORKER'
// Options object for modularized Emscripten files. Shoe-horned by @surma.
// FIXME: This an incomplete definition!
interface ModuleOpts {
mainScriptUrlOrBlob?: string
noInitialRun?: boolean
locateFile?: (url: string) => string
onRuntimeInitialized?: () => void
}
interface Module {
print(str: string): void
printErr(str: string): void
arguments: string[]
environment: EnvironmentType
preInit: { (): void }[]
preRun: { (): void }[]
postRun: { (): void }[]
preinitializedWebGLContext: WebGLRenderingContext
noInitialRun: boolean
noExitRuntime: boolean
logReadFiles: boolean
filePackagePrefixURL: string
wasmBinary: ArrayBuffer
destroy(object: object): void
getPreloadedPackage(
remotePackageName: string,
remotePackageSize: number
): ArrayBuffer
instantiateWasm(
imports: WebAssembly.Imports,
successCallback: (module: WebAssembly.Module) => void
): WebAssembly.Exports
locateFile(url: string): string
onCustomMessage(event: MessageEvent): void
Runtime: any
ccall(
ident: string,
returnType: string | null,
argTypes: string[],
args: any[]
): any
cwrap(ident: string, returnType: string | null, argTypes: string[]): any
setValue(ptr: number, value: any, type: string, noSafe?: boolean): void
getValue(ptr: number, type: string, noSafe?: boolean): number
ALLOC_NORMAL: number
ALLOC_STACK: number
ALLOC_STATIC: number
ALLOC_DYNAMIC: number
ALLOC_NONE: number
allocate(slab: any, types: string, allocator: number, ptr: number): number
allocate(slab: any, types: string[], allocator: number, ptr: number): number
Pointer_stringify(ptr: number, length?: number): string
UTF16ToString(ptr: number): string
stringToUTF16(str: string, outPtr: number): void
UTF32ToString(ptr: number): string
stringToUTF32(str: string, outPtr: number): void
// USE_TYPED_ARRAYS == 1
HEAP: Int32Array
IHEAP: Int32Array
FHEAP: Float64Array
// USE_TYPED_ARRAYS == 2
HEAP8: Int8Array
HEAP16: Int16Array
HEAP32: Int32Array
HEAPU8: Uint8Array
HEAPU16: Uint16Array
HEAPU32: Uint32Array
HEAPF32: Float32Array
HEAPF64: Float64Array
TOTAL_STACK: number
TOTAL_MEMORY: number
FAST_MEMORY: number
addOnPreRun(cb: () => any): void
addOnInit(cb: () => any): void
addOnPreMain(cb: () => any): void
addOnExit(cb: () => any): void
addOnPostRun(cb: () => any): void
// Tools
intArrayFromString(
stringy: string,
dontAddNull?: boolean,
length?: number
): number[]
intArrayToString(array: number[]): string
writeStringToMemory(str: string, buffer: number, dontAddNull: boolean): void
writeArrayToMemory(array: number[], buffer: number): void
writeAsciiToMemory(str: string, buffer: number, dontAddNull: boolean): void
addRunDependency(id: any): void
removeRunDependency(id: any): void
preloadedImages: any
preloadedAudios: any
_malloc(size: number): number
_free(ptr: number): void
// Augmentations below by @surma.
onRuntimeInitialized: () => void | null
}
}


@ -0,0 +1,31 @@
//
import { fileURLToPath } from 'node:url'
export function pathify(path: string): string {
if (path.startsWith('file://')) {
path = fileURLToPath(path)
}
return path
}
export function instantiateEmscriptenWasm<T extends EmscriptenWasm.Module>(
factory: EmscriptenWasm.ModuleFactory<T>,
path: string,
workerJS = ''
): Promise<T> {
return factory({
locateFile(requestPath) {
// The glue code generated by emscripten uses the original
// file names of the worker file and the wasm binary.
// These will have changed in the bundling process and
// we need to inject them here.
if (requestPath.endsWith('.wasm')) return pathify(path)
if (requestPath.endsWith('.worker.js')) return pathify(workerJS)
return requestPath
},
})
}
export function dirname(url: string) {
return url.substring(0, url.lastIndexOf('/'))
}


@ -0,0 +1,139 @@
import { cpus } from 'os'
import { isMainThread } from 'node:worker_threads';
import WorkerPool from '../../utils/workerPool.js';
import type { Operation } from './image.js';
import * as impl from './impl.js';
import execOnce from '../../utils/execOnce.js';
import type { OutputFormat } from '../../loaders/index.js';
const getWorker = execOnce(
() => {
return new WorkerPool(
// There will be at most 7 workers needed, since there are only 7 operation
// types and each worker handles at least one of them.
Math.max(1, Math.min(cpus().length - 1, 7)),
'./node_modules/@astrojs/image/dist/vendor/squoosh/image-pool.js'
);
}
)
type DecodeParams = {
operation: 'decode',
buffer: Buffer
};
type ResizeParams = {
operation: 'resize',
imageData: ImageData,
height?: number,
width?: number
};
type RotateParams = {
operation: 'rotate',
imageData: ImageData,
numRotations: number
};
type EncodeAvifParams = {
operation: 'encodeavif',
imageData: ImageData,
quality: number
}
type EncodeJpegParams = {
operation: 'encodejpeg',
imageData: ImageData,
quality: number
}
type EncodePngParams = {
operation: 'encodepng',
imageData: ImageData
}
type EncodeWebpParams = {
operation: 'encodewebp',
imageData: ImageData,
quality: number
}
type JobMessage = DecodeParams | ResizeParams | RotateParams | EncodeAvifParams | EncodeJpegParams | EncodePngParams | EncodeWebpParams
function handleJob(params: JobMessage) {
switch (params.operation) {
case 'decode':
return impl.decodeBuffer(params.buffer)
case 'resize':
return impl.resize({ image: params.imageData as any, width: params.width, height: params.height })
case 'rotate':
return impl.rotate(params.imageData as any, params.numRotations);
case 'encodeavif':
return impl.encodeAvif(params.imageData as any, { quality: params.quality })
case 'encodejpeg':
return impl.encodeJpeg(params.imageData as any, { quality: params.quality })
case 'encodepng':
return impl.encodePng(params.imageData as any)
case 'encodewebp':
return impl.encodeWebp(params.imageData as any, { quality: params.quality })
default:
throw Error(`Invalid job "${(params as any).operation}"`);
}
}
export async function processBuffer(
buffer: Buffer,
operations: Operation[],
encoding: OutputFormat,
quality: number
): Promise<Uint8Array> {
// @ts-ignore
const worker = await getWorker()
let imageData = await worker.dispatchJob({
operation: 'decode',
buffer,
})
for (const operation of operations) {
if (operation.type === 'rotate') {
imageData = await worker.dispatchJob({
operation: 'rotate',
imageData,
numRotations: operation.numRotations
});
} else if (operation.type === 'resize') {
imageData = await worker.dispatchJob({
operation: 'resize',
imageData,
height: operation.height,
width: operation.width
})
}
}
switch (encoding) {
case 'avif':
return await worker.dispatchJob({
operation: 'encodeavif',
imageData,
quality: quality || 100
}) as Uint8Array;
case 'jpeg':
case 'jpg':
return await worker.dispatchJob({
operation: 'encodejpeg',
imageData,
quality: quality || 100,
}) as Uint8Array;
case 'png':
return await worker.dispatchJob({
operation: 'encodepng',
imageData,
}) as Uint8Array;
case 'webp':
return await worker.dispatchJob({
operation: 'encodewebp',
imageData,
quality: quality || 100,
}) as Uint8Array;
default:
throw Error(`Unsupported encoding format`)
}
}
if (!isMainThread) {
WorkerPool.useThisThreadAsWorker(handleJob);
}
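
Roughly how the pooled `processBuffer` might be called from the Squoosh loader. This is a sketch; the input path, output width, and quality value are illustrative:

```ts
import { readFile } from 'node:fs/promises'
import { processBuffer } from './vendor/squoosh/image-pool.js'

// Resize to 640px wide and re-encode as WebP; each step (decode, resize,
// encodewebp) is dispatched as a separate job to the worker pool.
const input = await readFile('./src/assets/social.jpg')
const webp: Uint8Array = await processBuffer(
  input,
  [{ type: 'resize', width: 640 }],
  'webp',
  80
)
```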

View file

@ -0,0 +1,43 @@
import * as impl from './impl.js';
import type { OutputFormat } from '../../loaders/index.js';
type RotateOperation = {
type: 'rotate'
numRotations: number
}
type ResizeOperation = {
type: 'resize'
width?: number
height?: number
}
export type Operation = RotateOperation | ResizeOperation
export async function processBuffer(
buffer: Buffer,
operations: Operation[],
encoding: OutputFormat,
quality: number
): Promise<Uint8Array> {
let imageData = await impl.decodeBuffer(buffer)
for (const operation of operations) {
if (operation.type === 'rotate') {
imageData = await impl.rotate(imageData, operation.numRotations);
} else if (operation.type === 'resize') {
imageData = await impl.resize({ image: imageData, width: operation.width, height: operation.height })
}
}
switch (encoding) {
case 'avif':
return await impl.encodeAvif(imageData, { quality: quality }) as Uint8Array;
case 'jpeg':
case 'jpg':
return await impl.encodeJpeg(imageData, { quality }) as Uint8Array;
case 'png':
return await impl.encodePng(imageData) as Uint8Array;
case 'webp':
return await impl.encodeWebp(imageData, { quality }) as Uint8Array;
default:
throw Error(`Unsupported encoding format`)
}
}
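
The single-threaded variant has the same shape. A sketch of rotating an image a quarter turn and re-encoding it as JPEG, with illustrative paths and values:

```ts
import { readFile, writeFile } from 'node:fs/promises'
import { processBuffer } from './vendor/squoosh/image.js'

// Rotate by one quarter turn (numRotations counts 90 degree steps) and encode
// as JPEG, all in the calling thread; no worker pool is involved here.
const input = await readFile('./input.png')
const jpeg = await processBuffer(
  input,
  [{ type: 'rotate', numRotations: 1 }],
  'jpeg',
  80
)
await writeFile('./output.jpg', jpeg)
```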

View file

@ -0,0 +1,33 @@
export default class ImageData {
static from(input: ImageData): ImageData {
return new ImageData(input.data || input._data, input.width, input.height)
}
private _data: Buffer | Uint8Array | Uint8ClampedArray
width: number
height: number
get data(): Buffer {
if (Object.prototype.toString.call(this._data) === '[object Object]') {
return Buffer.from(Object.values(this._data))
}
if (
this._data instanceof Buffer ||
this._data instanceof Uint8Array ||
this._data instanceof Uint8ClampedArray
) {
return Buffer.from(this._data)
}
throw new Error('invariant')
}
constructor(
data: Buffer | Uint8Array | Uint8ClampedArray,
width: number,
height: number
) {
this._data = data
this.width = width
this.height = height
}
}
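
A small sketch of what this shim is normalizing: pixel data that round-trips through a worker can arrive as a plain object rather than a typed array, and the `data` getter rebuilds a Buffer either way. The 2x2 red square is example data:

```ts
import ImageData from './image_data.js'

// 2x2 opaque red square, RGBA byte order (example data).
const pixels = new Uint8ClampedArray([
  255, 0, 0, 255,   255, 0, 0, 255,
  255, 0, 0, 255,   255, 0, 0, 255,
])
const image = new ImageData(pixels, 2, 2)

// After being structured-cloned across a worker boundary the instance loses
// its prototype; `from` re-wraps it so `.data` is a Buffer again.
const restored = ImageData.from(image)
console.log(restored.data.length) // 16 bytes
```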

View file

@ -0,0 +1,135 @@
import { codecs as supportedFormats, preprocessors } from './codecs.js'
import ImageData from './image_data.js'
type EncoderKey = keyof typeof supportedFormats
const DELAY_MS = 1000
let _promise: Promise<void> | undefined
function delayOnce(ms: number): Promise<void> {
if (!_promise) {
_promise = new Promise((resolve) => {
setTimeout(resolve, ms)
})
}
return _promise
}
function maybeDelay(): Promise<void> {
const isAppleM1 = process.arch === 'arm64' && process.platform === 'darwin'
if (isAppleM1) {
return delayOnce(DELAY_MS)
}
return Promise.resolve()
}
export async function decodeBuffer(
_buffer: Buffer | Uint8Array
): Promise<ImageData> {
const buffer = Buffer.from(_buffer)
const firstChunk = buffer.slice(0, 16)
const firstChunkString = Array.from(firstChunk)
.map((v) => String.fromCodePoint(v))
.join('')
const key = Object.entries(supportedFormats).find(([, { detectors }]) =>
detectors.some((detector) => detector.exec(firstChunkString))
)?.[0] as EncoderKey | undefined
if (!key) {
throw Error(`Buffer has an unsupported format`)
}
const encoder = supportedFormats[key]
const mod = await encoder.dec()
const rgba = mod.decode(new Uint8Array(buffer))
// @ts-ignore
return rgba
}
export async function rotate(
image: ImageData,
numRotations: number
): Promise<ImageData> {
image = ImageData.from(image)
const m = await preprocessors['rotate'].instantiate()
return await m(image.data, image.width, image.height, { numRotations })
}
type ResizeOpts = { image: ImageData } & { width?: number; height?: number }
export async function resize({ image, width, height }: ResizeOpts) {
image = ImageData.from(image)
const p = preprocessors['resize']
const m = await p.instantiate()
await maybeDelay()
return await m(image.data, image.width, image.height, {
...p.defaultOptions,
width,
height,
})
}
export async function encodeJpeg(
image: ImageData,
{ quality }: { quality: number }
): Promise<Uint8Array> {
image = ImageData.from(image)
const e = supportedFormats['mozjpeg']
const m = await e.enc()
await maybeDelay()
const r = await m.encode(image.data, image.width, image.height, {
...e.defaultEncoderOptions,
quality,
})
return r
}
export async function encodeWebp(
image: ImageData,
{ quality }: { quality: number }
): Promise<Uint8Array> {
image = ImageData.from(image)
const e = supportedFormats['webp']
const m = await e.enc()
await maybeDelay()
const r = await m.encode(image.data, image.width, image.height, {
...e.defaultEncoderOptions,
quality,
})
return r
}
export async function encodeAvif(
image: ImageData,
{ quality }: { quality: number }
): Promise<Uint8Array> {
image = ImageData.from(image)
const e = supportedFormats['avif']
const m = await e.enc()
await maybeDelay()
const val = e.autoOptimize.min
const r = await m.encode(image.data, image.width, image.height, {
...e.defaultEncoderOptions,
// Think of cqLevel as the "amount" of quantization (0 to 62),
// so a lower value yields higher quality (0 to 100).
cqLevel: quality === 0 ? val : Math.round(val - (quality / 100) * val),
})
return r
}
export async function encodePng(
image: ImageData
): Promise<Uint8Array> {
image = ImageData.from(image)
const e = supportedFormats['oxipng']
const m = await e.enc()
await maybeDelay()
const r = await m.encode(image.data, image.width, image.height, {
...e.defaultEncoderOptions,
})
return r
}
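
Putting the pieces together, a hedged sketch of calling these helpers directly, without the worker pool. The file names are illustrative:

```ts
import { readFile, writeFile } from 'node:fs/promises'
import * as impl from './impl.js'

// Decode whatever supported format the buffer is in, scale it down, and emit AVIF.
const source = await readFile('./photo.jpg')
const decoded = await impl.decodeBuffer(source)
const resized = await impl.resize({ image: decoded, width: 320 })
const avif = await impl.encodeAvif(resized, { quality: 60 })
await writeFile('./photo.avif', avif)
```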

View file

@ -0,0 +1,38 @@
// eslint-disable-next-line no-shadow
export const enum MozJpegColorSpace {
GRAYSCALE = 1,
RGB,
YCbCr,
}
export interface EncodeOptions {
quality: number
baseline: boolean
arithmetic: boolean
progressive: boolean
optimize_coding: boolean
smoothing: number
color_space: MozJpegColorSpace
quant_table: number
trellis_multipass: boolean
trellis_opt_zero: boolean
trellis_opt_table: boolean
trellis_loops: number
auto_subsample: boolean
chroma_subsample: number
separate_chroma_quality: boolean
chroma_quality: number
}
export interface MozJPEGModule extends EmscriptenWasm.Module {
encode(
data: BufferSource,
width: number,
height: number,
options: EncodeOptions
): Uint8Array
}
declare var moduleFactory: EmscriptenWasm.ModuleFactory<MozJPEGModule>
export default moduleFactory

File diff suppressed because it is too large.

File diff suppressed because it is too large.

View file

@ -0,0 +1,120 @@
// @ts-nocheck
let wasm
let cachedTextDecoder = new TextDecoder('utf-8', {
ignoreBOM: true,
fatal: true,
})
cachedTextDecoder.decode()
let cachegetUint8Memory0 = null
function getUint8Memory0() {
if (
cachegetUint8Memory0 === null ||
cachegetUint8Memory0.buffer !== wasm.memory.buffer
) {
cachegetUint8Memory0 = new Uint8Array(wasm.memory.buffer)
}
return cachegetUint8Memory0
}
function getStringFromWasm0(ptr, len) {
return cachedTextDecoder.decode(getUint8Memory0().subarray(ptr, ptr + len))
}
let WASM_VECTOR_LEN = 0
function passArray8ToWasm0(arg, malloc) {
const ptr = malloc(arg.length * 1)
getUint8Memory0().set(arg, ptr / 1)
WASM_VECTOR_LEN = arg.length
return ptr
}
let cachegetInt32Memory0 = null
function getInt32Memory0() {
if (
cachegetInt32Memory0 === null ||
cachegetInt32Memory0.buffer !== wasm.memory.buffer
) {
cachegetInt32Memory0 = new Int32Array(wasm.memory.buffer)
}
return cachegetInt32Memory0
}
function getArrayU8FromWasm0(ptr, len) {
return getUint8Memory0().subarray(ptr / 1, ptr / 1 + len)
}
/**
* @param {Uint8Array} data
* @param {number} level
* @param {boolean} interlace
* @returns {Uint8Array}
*/
export function optimise(data, level, interlace) {
try {
const retptr = wasm.__wbindgen_add_to_stack_pointer(-16)
const ptr0 = passArray8ToWasm0(data, wasm.__wbindgen_malloc)
const len0 = WASM_VECTOR_LEN
wasm.optimise(retptr, ptr0, len0, level, interlace)
const r0 = getInt32Memory0()[retptr / 4 + 0]
const r1 = getInt32Memory0()[retptr / 4 + 1]
const v1 = getArrayU8FromWasm0(r0, r1).slice()
wasm.__wbindgen_free(r0, r1 * 1)
return v1
} finally {
wasm.__wbindgen_add_to_stack_pointer(16)
}
}
async function load(module, imports) {
if (typeof Response === 'function' && module instanceof Response) {
if (typeof WebAssembly.instantiateStreaming === 'function') {
return await WebAssembly.instantiateStreaming(module, imports)
}
const bytes = await module.arrayBuffer()
return await WebAssembly.instantiate(bytes, imports)
} else {
const instance = await WebAssembly.instantiate(module, imports)
if (instance instanceof WebAssembly.Instance) {
return { instance, module }
} else {
return instance
}
}
}
async function init(input) {
const imports = {}
imports.wbg = {}
imports.wbg.__wbindgen_throw = function (arg0, arg1) {
throw new Error(getStringFromWasm0(arg0, arg1))
}
if (
typeof input === 'string' ||
(typeof Request === 'function' && input instanceof Request) ||
(typeof URL === 'function' && input instanceof URL)
) {
input = fetch(input)
}
const { instance, module } = await load(await input, imports)
wasm = instance.exports
init.__wbindgen_wasm_module = module
return wasm
}
export default init
// Manually remove the wasm and memory references to trigger GC
export function cleanup() {
wasm = null
cachegetUint8Memory0 = null
cachegetInt32Memory0 = null
}
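
A minimal sketch of driving this glue from Node. The module and `.wasm` filenames are assumptions; passing raw bytes to `init` avoids the `fetch` code path:

```ts
import { readFile } from 'node:fs/promises'
import init, { optimise, cleanup } from './squoosh_oxipng.js'

// Instantiate the wasm module once from bytes on disk.
const wasmBytes = await readFile(new URL('./squoosh_oxipng_bg.wasm', import.meta.url))
await init(wasmBytes)

// Losslessly optimise a PNG: optimisation level 2, no interlacing.
const input = await readFile('./input.png')
const optimised = optimise(new Uint8Array(input), 2, false)

// Drop the wasm/memory references so they can be garbage collected.
cleanup()
```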

View file

@ -0,0 +1,184 @@
// @ts-nocheck
let wasm
let cachedTextDecoder = new TextDecoder('utf-8', {
ignoreBOM: true,
fatal: true,
})
cachedTextDecoder.decode()
let cachegetUint8Memory0 = null
function getUint8Memory0() {
if (
cachegetUint8Memory0 === null ||
cachegetUint8Memory0.buffer !== wasm.memory.buffer
) {
cachegetUint8Memory0 = new Uint8Array(wasm.memory.buffer)
}
return cachegetUint8Memory0
}
function getStringFromWasm0(ptr, len) {
return cachedTextDecoder.decode(getUint8Memory0().subarray(ptr, ptr + len))
}
let cachegetUint8ClampedMemory0 = null
function getUint8ClampedMemory0() {
if (
cachegetUint8ClampedMemory0 === null ||
cachegetUint8ClampedMemory0.buffer !== wasm.memory.buffer
) {
cachegetUint8ClampedMemory0 = new Uint8ClampedArray(wasm.memory.buffer)
}
return cachegetUint8ClampedMemory0
}
function getClampedArrayU8FromWasm0(ptr, len) {
return getUint8ClampedMemory0().subarray(ptr / 1, ptr / 1 + len)
}
const heap = new Array(32).fill(undefined)
heap.push(undefined, null, true, false)
let heap_next = heap.length
function addHeapObject(obj) {
if (heap_next === heap.length) heap.push(heap.length + 1)
const idx = heap_next
heap_next = heap[idx]
heap[idx] = obj
return idx
}
let WASM_VECTOR_LEN = 0
function passArray8ToWasm0(arg, malloc) {
const ptr = malloc(arg.length * 1)
getUint8Memory0().set(arg, ptr / 1)
WASM_VECTOR_LEN = arg.length
return ptr
}
let cachegetInt32Memory0 = null
function getInt32Memory0() {
if (
cachegetInt32Memory0 === null ||
cachegetInt32Memory0.buffer !== wasm.memory.buffer
) {
cachegetInt32Memory0 = new Int32Array(wasm.memory.buffer)
}
return cachegetInt32Memory0
}
function getArrayU8FromWasm0(ptr, len) {
return getUint8Memory0().subarray(ptr / 1, ptr / 1 + len)
}
/**
* @param {Uint8Array} data
* @param {number} width
* @param {number} height
* @returns {Uint8Array}
*/
export function encode(data, width, height) {
try {
const retptr = wasm.__wbindgen_add_to_stack_pointer(-16)
const ptr0 = passArray8ToWasm0(data, wasm.__wbindgen_malloc)
const len0 = WASM_VECTOR_LEN
wasm.encode(retptr, ptr0, len0, width, height)
const r0 = getInt32Memory0()[retptr / 4 + 0]
const r1 = getInt32Memory0()[retptr / 4 + 1]
const v1 = getArrayU8FromWasm0(r0, r1).slice()
wasm.__wbindgen_free(r0, r1 * 1)
return v1
} finally {
wasm.__wbindgen_add_to_stack_pointer(16)
}
}
function getObject(idx) {
return heap[idx]
}
function dropObject(idx) {
if (idx < 36) return
heap[idx] = heap_next
heap_next = idx
}
function takeObject(idx) {
const ret = getObject(idx)
dropObject(idx)
return ret
}
/**
* @param {Uint8Array} data
* @returns {ImageData}
*/
export function decode(data) {
const ptr0 = passArray8ToWasm0(data, wasm.__wbindgen_malloc)
const len0 = WASM_VECTOR_LEN
const ret = wasm.decode(ptr0, len0)
return takeObject(ret)
}
async function load(module, imports) {
if (typeof Response === 'function' && module instanceof Response) {
if (typeof WebAssembly.instantiateStreaming === 'function') {
return await WebAssembly.instantiateStreaming(module, imports)
}
const bytes = await module.arrayBuffer()
return await WebAssembly.instantiate(bytes, imports)
} else {
const instance = await WebAssembly.instantiate(module, imports)
if (instance instanceof WebAssembly.Instance) {
return { instance, module }
} else {
return instance
}
}
}
async function init(input) {
const imports = {}
imports.wbg = {}
imports.wbg.__wbg_newwithownedu8clampedarrayandsh_787b2db8ea6bfd62 =
function (arg0, arg1, arg2, arg3) {
const v0 = getClampedArrayU8FromWasm0(arg0, arg1).slice()
wasm.__wbindgen_free(arg0, arg1 * 1)
const ret = new ImageData(v0, arg2 >>> 0, arg3 >>> 0)
return addHeapObject(ret)
}
imports.wbg.__wbindgen_throw = function (arg0, arg1) {
throw new Error(getStringFromWasm0(arg0, arg1))
}
if (
typeof input === 'string' ||
(typeof Request === 'function' && input instanceof Request) ||
(typeof URL === 'function' && input instanceof URL)
) {
input = fetch(input)
}
const { instance, module } = await load(await input, imports)
wasm = instance.exports
init.__wbindgen_wasm_module = module
return wasm
}
export default init
// Manually remove the wasm and memory references to trigger GC
export function cleanup() {
wasm = null
cachegetUint8ClampedMemory0 = null
cachegetUint8Memory0 = null
cachegetInt32Memory0 = null
}
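
This pair of exports, a `decode` that builds an `ImageData` and an `encode` that returns encoded bytes, looks like the PNG codec's wasm-bindgen glue; the sketch below is written under that assumption, with illustrative filenames. Note that `decode` calls `new ImageData(...)`, so in Node a global `ImageData` shim (such as the class defined earlier in this diff) is assumed to be installed first:

```ts
import { readFile } from 'node:fs/promises'
import init, { decode, encode, cleanup } from './squoosh_png.js'
import ImageData from '../image_data.js'

// decode() constructs a DOM-style ImageData, so provide a shim in Node.
;(globalThis as any).ImageData ??= ImageData

await init(await readFile(new URL('./squoosh_png_bg.wasm', import.meta.url)))

const rgba = decode(new Uint8Array(await readFile('./input.png'))) // ImageData (RGBA)
const png = encode(rgba.data, rgba.width, rgba.height)             // Uint8Array of encoded bytes
cleanup()
```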

View file

@ -0,0 +1,141 @@
// @ts-nocheck
let wasm
let cachegetUint8Memory0 = null
function getUint8Memory0() {
if (
cachegetUint8Memory0 === null ||
cachegetUint8Memory0.buffer !== wasm.memory.buffer
) {
cachegetUint8Memory0 = new Uint8Array(wasm.memory.buffer)
}
return cachegetUint8Memory0
}
let WASM_VECTOR_LEN = 0
function passArray8ToWasm0(arg, malloc) {
const ptr = malloc(arg.length * 1)
getUint8Memory0().set(arg, ptr / 1)
WASM_VECTOR_LEN = arg.length
return ptr
}
let cachegetInt32Memory0 = null
function getInt32Memory0() {
if (
cachegetInt32Memory0 === null ||
cachegetInt32Memory0.buffer !== wasm.memory.buffer
) {
cachegetInt32Memory0 = new Int32Array(wasm.memory.buffer)
}
return cachegetInt32Memory0
}
let cachegetUint8ClampedMemory0 = null
function getUint8ClampedMemory0() {
if (
cachegetUint8ClampedMemory0 === null ||
cachegetUint8ClampedMemory0.buffer !== wasm.memory.buffer
) {
cachegetUint8ClampedMemory0 = new Uint8ClampedArray(wasm.memory.buffer)
}
return cachegetUint8ClampedMemory0
}
function getClampedArrayU8FromWasm0(ptr, len) {
return getUint8ClampedMemory0().subarray(ptr / 1, ptr / 1 + len)
}
/**
* @param {Uint8Array} input_image
* @param {number} input_width
* @param {number} input_height
* @param {number} output_width
* @param {number} output_height
* @param {number} typ_idx
* @param {boolean} premultiply
* @param {boolean} color_space_conversion
* @returns {Uint8ClampedArray}
*/
export function resize(
input_image,
input_width,
input_height,
output_width,
output_height,
typ_idx,
premultiply,
color_space_conversion
) {
try {
const retptr = wasm.__wbindgen_add_to_stack_pointer(-16)
const ptr0 = passArray8ToWasm0(input_image, wasm.__wbindgen_malloc)
const len0 = WASM_VECTOR_LEN
wasm.resize(
retptr,
ptr0,
len0,
input_width,
input_height,
output_width,
output_height,
typ_idx,
premultiply,
color_space_conversion
)
const r0 = getInt32Memory0()[retptr / 4 + 0]
const r1 = getInt32Memory0()[retptr / 4 + 1]
const v1 = getClampedArrayU8FromWasm0(r0, r1).slice()
wasm.__wbindgen_free(r0, r1 * 1)
return v1
} finally {
wasm.__wbindgen_add_to_stack_pointer(16)
}
}
async function load(module, imports) {
if (typeof Response === 'function' && module instanceof Response) {
if (typeof WebAssembly.instantiateStreaming === 'function') {
return await WebAssembly.instantiateStreaming(module, imports)
}
const bytes = await module.arrayBuffer()
return await WebAssembly.instantiate(bytes, imports)
} else {
const instance = await WebAssembly.instantiate(module, imports)
if (instance instanceof WebAssembly.Instance) {
return { instance, module }
} else {
return instance
}
}
}
async function init(input) {
const imports = {}
if (
typeof input === 'string' ||
(typeof Request === 'function' && input instanceof Request) ||
(typeof URL === 'function' && input instanceof URL)
) {
input = fetch(input)
}
const { instance, module } = await load(await input, imports)
wasm = instance.exports
init.__wbindgen_wasm_module = module
return wasm
}
export default init
// Manually remove the wasm and memory references to trigger GC
export function cleanup() {
wasm = null
cachegetUint8Memory0 = null
cachegetInt32Memory0 = null
}
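
Since `resize` takes eight positional arguments, an annotated call sketch may be easier to follow. The values, the wasm path, and the reading of `typ_idx = 3` as Lanczos3 come from upstream Squoosh and should be treated as assumptions:

```ts
import { readFile } from 'node:fs/promises'
import init, { resize, cleanup } from './squoosh_resize.js'

await init(await readFile(new URL('./squoosh_resize_bg.wasm', import.meta.url)))

// Stand-in for already-decoded RGBA pixels of an 800x600 image.
const rgba = new Uint8Array(800 * 600 * 4)

const out = resize(
  rgba,       // input_image: RGBA bytes
  800, 600,   // input_width, input_height
  400, 300,   // output_width, output_height
  3,          // typ_idx: resize filter index (3 is Lanczos3 upstream)
  true,       // premultiply alpha before resizing
  true        // color_space_conversion: resize in linear light
)
cleanup()
// `out` is a Uint8ClampedArray of 400x300 RGBA pixels.
```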

Binary file not shown.

View file

@ -0,0 +1,42 @@
export interface EncodeOptions {
quality: number
target_size: number
target_PSNR: number
method: number
sns_strength: number
filter_strength: number
filter_sharpness: number
filter_type: number
partitions: number
segments: number
pass: number
show_compressed: number
preprocessing: number
autofilter: number
partition_limit: number
alpha_compression: number
alpha_filtering: number
alpha_quality: number
lossless: number
exact: number
image_hint: number
emulate_jpeg_size: number
thread_level: number
low_memory: number
near_lossless: number
use_delta_palette: number
use_sharp_yuv: number
}
export interface WebPModule extends EmscriptenWasm.Module {
encode(
data: BufferSource,
width: number,
height: number,
options: EncodeOptions
): Uint8Array
}
declare var moduleFactory: EmscriptenWasm.ModuleFactory<WebPModule>
export default moduleFactory

File diff suppressed because it is too large.

File diff suppressed because it is too large.

View file

@ -8,7 +8,6 @@ import slash from 'slash';
import type { Plugin, ResolvedConfig } from 'vite';
import type { IntegrationOptions } from './index.js';
import type { InputFormat } from './loaders/index.js';
import sharp from './loaders/sharp.js';
import { metadata } from './utils/metadata.js';
export interface ImageMetadata {
@ -90,13 +89,13 @@ export function createPlugin(config: AstroConfig, options: Required<IntegrationO
return next();
}
const transform = await sharp.parseTransform(url.searchParams);
const transform = await globalThis.astroImage.defaultLoader.parseTransform(url.searchParams);
if (!transform) {
return next();
}
const result = await sharp.transform(file, transform);
const result = await globalThis.astroImage.defaultLoader.transform(file, transform);
res.setHeader('Content-Type', `image/${result.format}`);
res.setHeader('Cache-Control', 'max-age=360000');
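
For context on the change above: the dev middleware no longer imports `sharp` directly and instead goes through `globalThis.astroImage.defaultLoader`, which the integration is expected to set at startup. A sketch of the shape those two calls assume; the type names below are inferred from the call sites, not taken from this hunk:

```ts
// Hypothetical shape inferred from the parseTransform/transform call sites.
interface DefaultLoader {
  parseTransform(params: URLSearchParams): Promise<Record<string, any> | undefined>
  transform(
    input: Buffer,
    transform: Record<string, any>
  ): Promise<{ data: Buffer; format: string }>
}

declare global {
  // eslint-disable-next-line no-var
  var astroImage: { defaultLoader: DefaultLoader }
}

export {}
```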

View file

@ -4,5 +4,5 @@ import image from '@astrojs/image';
// https://astro.build/config
export default defineConfig({
site: 'http://localhost:3000',
integrations: [image({ logLevel: 'silent' })]
integrations: [image({ logLevel: 'silent', serviceEntryPoint: '@astrojs/image/sharp' })]
});

View file

@ -5,6 +5,7 @@
"dependencies": {
"@astrojs/image": "workspace:*",
"@astrojs/node": "workspace:*",
"astro": "workspace:*"
"astro": "workspace:*",
"sharp": "^0.31.0"
}
}

View file

@ -4,5 +4,5 @@ import image from '@astrojs/image';
// https://astro.build/config
export default defineConfig({
site: 'http://localhost:3000',
integrations: [image({ logLevel: 'silent' })]
integrations: [image({ logLevel: 'silent', serviceEntryPoint: '@astrojs/image/sharp' })]
});

View file

@ -5,6 +5,7 @@
"dependencies": {
"@astrojs/image": "workspace:*",
"@astrojs/node": "workspace:*",
"astro": "workspace:*"
"astro": "workspace:*",
"sharp": "^0.31.0"
}
}

View file

@ -23,6 +23,6 @@ import { Image } from '@astrojs/image/components';
<br />
<Image id="bg-color" src="https://www.google.com/images/branding/googlelogo/2x/googlelogo_color_272x92dp.png" width={544} height={184} format="jpeg" alt="Google" background="#333333" />
<br />
<Image id="ipsum" src="https://picsum.photos/200/300" width={200} height={300} alt="ipsum" format="jpeg" />
<Image id="ipsum" src="https://dummyimage.com/200x300" width={200} height={300} alt="ipsum" format="jpeg" />
</body>
</html>

View file

@ -4,5 +4,5 @@ import image from '@astrojs/image';
// https://astro.build/config
export default defineConfig({
site: 'http://localhost:3000',
integrations: [image({ logLevel: 'silent' })]
integrations: [image({ logLevel: 'silent', serviceEntryPoint: '@astrojs/image/sharp' })]
});

View file

@ -5,6 +5,7 @@
"dependencies": {
"@astrojs/image": "workspace:*",
"@astrojs/node": "workspace:*",
"astro": "workspace:*"
"astro": "workspace:*",
"sharp": "^0.31.0"
}
}

View file

@ -18,6 +18,6 @@ import { Picture } from '@astrojs/image/components';
<br />
<Picture id="bg-color" src="https://www.google.com/images/branding/googlelogo/2x/googlelogo_color_272x92dp.png" sizes="(min-width: 640px) 50vw, 100vw" widths={[272, 544]} aspectRatio={544/184} alt="Google logo" background="rgb(51, 51, 51)" formats={['avif', 'jpeg']} />
<br />
<Picture id="ipsum" src="https://picsum.photos/200/300" sizes="100vw" widths={[100, 200]} aspectRatio={2/3} formats={["avif", "webp", "jpg"]} alt="ipsum" />
<Picture id="ipsum" src="https://dummyimage.com/200x300" sizes="100vw" widths={[100, 200]} aspectRatio={2/3} formats={["avif", "webp", "jpg"]} alt="ipsum" />
</body>
</html>

View file

@ -4,5 +4,5 @@ import image from '@astrojs/image';
// https://astro.build/config
export default defineConfig({
site: 'http://localhost:3000',
integrations: [image({ logLevel: 'silent' })]
integrations: [image({ logLevel: 'silent', serviceEntryPoint: '@astrojs/image/sharp' })]
});

View file

@ -5,6 +5,7 @@
"dependencies": {
"@astrojs/image": "workspace:*",
"@astrojs/node": "workspace:*",
"astro": "workspace:*"
"astro": "workspace:*",
"sharp": "^0.31.0"
}
}

View file

@ -4,5 +4,5 @@ import image from '@astrojs/image';
// https://astro.build/config
export default defineConfig({
site: 'http://localhost:3000',
integrations: [image({ logLevel: 'silent' })]
integrations: [image({ logLevel: 'silent', serviceEntryPoint: '@astrojs/image/sharp' })]
});

View file

@ -5,6 +5,7 @@
"dependencies": {
"@astrojs/image": "workspace:*",
"@astrojs/node": "workspace:*",
"astro": "workspace:*"
"astro": "workspace:*",
"sharp": "^0.31.0"
}
}

View file

@ -4,5 +4,5 @@ import image from '@astrojs/image';
// https://astro.build/config
export default defineConfig({
site: 'http://localhost:3000',
integrations: [image({ logLevel: 'silent' })]
integrations: [image({ logLevel: 'silent', serviceEntryPoint: '@astrojs/image/sharp' })]
});

View file

@ -5,6 +5,7 @@
"dependencies": {
"@astrojs/image": "workspace:*",
"@astrojs/node": "workspace:*",
"astro": "workspace:*"
"astro": "workspace:*",
"sharp": "^0.31.0"
}
}

View file

@ -0,0 +1,8 @@
import { defineConfig } from 'astro/config';
import image from '@astrojs/image';
// https://astro.build/config
export default defineConfig({
site: 'http://localhost:3000',
integrations: [image({ logLevel: 'silent' })]
});

View file

@ -0,0 +1,10 @@
{
"name": "@test/basic-image",
"version": "0.0.0",
"private": true,
"dependencies": {
"@astrojs/image": "workspace:*",
"@astrojs/node": "workspace:*",
"astro": "workspace:*"
}
}

Binary file not shown. (New image, 4.2 KiB)

Binary file not shown. (New image, 270 KiB)

View file

@ -0,0 +1,44 @@
import { createServer } from 'http';
import fs from 'fs';
import mime from 'mime';
import { handler as ssrHandler } from '../dist/server/entry.mjs';
const clientRoot = new URL('../dist/client/', import.meta.url);
async function handle(req, res) {
ssrHandler(req, res, async (err) => {
if (err) {
res.writeHead(500);
res.end(err.stack);
return;
}
let local = new URL('.' + req.url, clientRoot);
try {
const data = await fs.promises.readFile(local);
res.writeHead(200, {
'Content-Type': mime.getType(req.url),
});
res.end(data);
} catch {
res.writeHead(404);
res.end();
}
});
}
const server = createServer((req, res) => {
handle(req, res).catch((err) => {
console.error(err);
res.writeHead(500, {
'Content-Type': 'text/plain',
});
res.end(err.toString());
});
});
server.listen(8085);
console.log('Serving at http://localhost:8085');
// Silence weird <time> warning
console.error = () => {};

Binary file not shown. (New image, 270 KiB)

Binary file not shown. (New image, 25 KiB)

Binary file not shown. (New image, 1.4 MiB)

View file

@ -0,0 +1,18 @@
---
import socialJpg from '../assets/social.jpg';
import introJpg from '../assets/blog/introducing astro.jpg';
import { Image } from '@astrojs/image/components';
---
<html>
<head>
<!-- Head Stuff -->
</head>
<body>
<Image id="hero" src="/hero.jpg" width={768} height={414} format="webp" alt="hero" />
<br />
<Image id="social-jpg" src={socialJpg} width={506} height={253} alt="social-jpg" />
<br />
<Image id="google" src="https://www.google.com/images/branding/googlelogo/2x/googlelogo_color_272x92dp.png" width={544} height={184} format="webp" alt="Google" />
</body>
</html>

View file

@ -5,5 +5,5 @@ import mdx from '@astrojs/mdx';
// https://astro.build/config
export default defineConfig({
site: 'http://localhost:3000',
integrations: [image({ logLevel: 'silent' }), mdx()]
integrations: [image({ logLevel: 'silent', serviceEntryPoint: '@astrojs/image/sharp' }), mdx()]
});

View file

@ -6,6 +6,7 @@
"@astrojs/image": "workspace:*",
"@astrojs/mdx": "workspace:*",
"@astrojs/node": "workspace:*",
"astro": "workspace:*"
"astro": "workspace:*",
"sharp": "^0.31.0"
}
}

View file

@ -57,7 +57,7 @@ describe('SSG images - dev', function () {
query: {
w: '200',
h: '300',
href: 'https://picsum.photos/200/300',
href: 'https://dummyimage.com/200x300',
},
},
{
@ -150,7 +150,7 @@ describe('SSG images with subpath - dev', function () {
query: {
w: '200',
h: '300',
href: 'https://picsum.photos/200/300',
href: 'https://dummyimage.com/200x300',
},
},
{
@ -237,7 +237,7 @@ describe('SSG images - build', function () {
{
title: 'Remote without file extension',
id: '#ipsum',
regex: /^\/assets\/300_\w{4,10}/,
regex: /^\/assets\/200x300_\w{4,10}/,
size: { width: 200, height: 300, type: 'jpg' },
},
{
@ -313,7 +313,7 @@ describe('SSG images with subpath - build', function () {
{
title: 'Remote without file extension',
id: '#ipsum',
regex: /^\/docs\/assets\/300_\w{4,10}/,
regex: /^\/docs\/assets\/200x300_\w{4,10}/,
size: { width: 200, height: 300, type: 'jpg' },
},
{

View file

@ -52,7 +52,7 @@ describe('SSR images - build', async function () {
query: {
w: '200',
h: '300',
href: 'https://picsum.photos/200/300',
href: 'https://dummyimage.com/200x300',
},
},
{
@ -168,7 +168,7 @@ describe('SSR images with subpath - build', function () {
query: {
w: '200',
h: '300',
href: 'https://picsum.photos/200/300',
href: 'https://dummyimage.com/200x300',
},
},
{

View file

@ -65,7 +65,7 @@ describe('SSR images - dev', function () {
query: {
w: '200',
h: '300',
href: 'https://picsum.photos/200/300',
href: 'https://dummyimage.com/200x300',
},
contentType: 'image/jpeg',
},
@ -180,7 +180,7 @@ describe('SSR images with subpath - dev', function () {
query: {
w: '200',
h: '300',
href: 'https://picsum.photos/200/300',
href: 'https://dummyimage.com/200x300',
},
contentType: 'image/jpeg',
},

View file

@ -216,7 +216,7 @@ describe('SSG pictures - build', function () {
{
title: 'Remote without file extension',
id: '#ipsum',
regex: /^\/assets\/300_\w{4,10}/,
regex: /^\/assets\/200x300_\w{4,10}/,
size: { width: 200, height: 300, type: 'jpg' },
alt: 'ipsum',
},
@ -311,7 +311,7 @@ describe('SSG pictures with subpath - build', function () {
{
title: 'Remote without file extension',
id: '#ipsum',
regex: /^\/docs\/assets\/300_\w{4,10}/,
regex: /^\/docs\/assets\/200x300_\w{4,10}/,
size: { width: 200, height: 300, type: 'jpg' },
alt: 'ipsum',
},

View file

@ -49,7 +49,7 @@ describe('SSR pictures - build', function () {
query: {
w: '200',
h: '300',
href: 'https://picsum.photos/200/300',
href: 'https://dummyimage.com/200x300',
},
alt: 'ipsum',
},
@ -153,7 +153,7 @@ describe('SSR pictures with subpath - build', function () {
query: {
w: '200',
h: '300',
href: 'https://picsum.photos/200/300',
href: 'https://dummyimage.com/200x300',
},
alt: 'ipsum',
},

View file

@ -62,7 +62,7 @@ describe('SSR pictures - dev', function () {
f: 'jpg',
w: '200',
h: '300',
href: 'https://picsum.photos/200/300',
href: 'https://dummyimage.com/200x300',
},
contentType: 'image/jpeg',
alt: 'ipsum',
@ -183,7 +183,7 @@ describe('SSR pictures with subpath - dev', function () {
f: 'jpg',
w: '200',
h: '300',
href: 'https://picsum.photos/200/300',
href: 'https://dummyimage.com/200x300',
},
contentType: 'image/jpeg',
alt: 'ipsum',

View file

@ -0,0 +1,61 @@
import { expect } from 'chai';
import * as cheerio from 'cheerio';
import { loadFixture } from './test-utils.js';
describe('Squoosh service', function () {
let fixture;
let devServer;
let $;
before(async () => {
fixture = await loadFixture({ root: './fixtures/basic-image/' });
devServer = await fixture.startDevServer();
const html = await fixture.fetch('/').then((res) => res.text());
$ = cheerio.load(html);
});
after(async () => {
await devServer.stop();
});
[
{
title: 'Local images',
id: '#social-jpg',
url: '/@astroimage/assets/social.jpg',
query: { f: 'jpg', w: '506', h: '253' },
},
{
title: 'Remote images',
id: '#google',
url: '/_image',
query: {
f: 'webp',
w: '544',
h: '184',
href: 'https://www.google.com/images/branding/googlelogo/2x/googlelogo_color_272x92dp.png',
},
},
{
title: 'Public images',
id: '#hero',
url: '/_image',
query: { f: 'webp', w: '768', h: '414', href: '/hero.jpg' },
},
].forEach(({ title, id, url, query }) => {
it(title, () => {
const image = $(id);
const src = image.attr('src');
const [route, params] = src.split('?');
expect(route).to.equal(url);
const searchParams = new URLSearchParams(params);
for (const [key, value] of Object.entries(query)) {
expect(searchParams.get(key)).to.equal(value);
}
});
});
});

File diff suppressed because it is too large.