Building the Optimize CLI tool

Back in September, I had this idea for a project (a launch for another time). There I was, building another microsite. Run it locally. Everything looks perfect. Push to production...

Broken Open Graph tags. Alt text missing on half the images. No Schema markup. URLs that reference the dev environment. And a critical page that has noindex from testing.

That's the thing about meta tags, schema markup, and accessibility attributes: they're invisible until they're not. Until you push to production. Run the schema.org validator. Open Chrome DevTools and run Lighthouse. Someone shares your link and you don't see that pretty social image. A screen reader encounters your page. Or Google Search Console says you've got a bunch of 404 pages.

Who's got time to be missing key things when we're all trying to keep up with changes in AI? I just wanted to build a static site that technically had a fighting chance to stand out in generative and traditional search.

Why Not Just Use Lighthouse?

Lighthouse is great. But it's solving a different problem.

It boots up headless Chrome, executes JavaScript, measures Core Web Vitals, captures screenshots, analyzes runtime performance... For a static site where I just need to know "are my meta tags correct?" or "does the schema JSON exist?", it's overkill during development.
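For a question like "does the schema JSON exist?", a static check over the built HTML is enough. Here's a rough sketch of that idea, purely illustrative and not the tool's actual implementation: it looks for a JSON-LD script block and confirms the contents parse as JSON.

```javascript
// Illustrative sketch: answer "does this page embed valid JSON-LD schema?"
// without booting a browser. Not Optimize's actual implementation.
function hasJsonLdSchema(html) {
  // Find a <script type="application/ld+json"> block in the raw HTML.
  const match = html.match(
    /<script[^>]*type=["']application\/ld\+json["'][^>]*>([\s\S]*?)<\/script>/i
  );
  if (!match) return false;
  try {
    JSON.parse(match[1]); // Confirm the block is actually valid JSON.
    return true;
  } catch {
    return false;
  }
}
```

A check like this runs in microseconds per page, because it's just a string scan, no JavaScript execution, no rendering.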

Have you ever tried auditing a 50-page site? With Lighthouse you can only audit one page at a time. For batch audits there's Unlighthouse, which is pretty cool, but it pulls in the same dependencies as Lighthouse, so all of the above still applies.

And if you're working in a resource-constrained environment (not by choice but by circumstance), every node_module and dependency counts. Running Lighthouse locally was not an option. Imagine building on a 20GB flash drive. How can you efficiently test a site before it makes it to production—and without causing your system to crash?

Creative problem solving

Then I thought, why not just test the build folder?

What started out as just a few simple Node.js script checks turned into something I could customize for any project. Space was no longer a problem. And what took minutes, I was able to do in seconds.
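To give a flavor of what those early scripts looked like, here's a minimal sketch of a "test the build folder" check, assuming a flat `dist/` of HTML files. It flags `<img>` tags with no alt attribute. This is my illustration of the approach, not the tool's actual code.

```javascript
import { readdirSync, readFileSync } from 'node:fs';
import { join, extname } from 'node:path';

// Illustrative sketch of a post-build check: flag <img> tags that
// have no alt attribute. Not Optimize's actual implementation.
function findImagesMissingAlt(html) {
  const problems = [];
  for (const tag of html.match(/<img\b[^>]*>/gi) ?? []) {
    if (!/\balt\s*=/i.test(tag)) problems.push(tag);
  }
  return problems;
}

// Walk a (flat, for simplicity) build directory and collect offenders.
function auditDir(dir) {
  const report = {};
  for (const name of readdirSync(dir)) {
    if (extname(name) !== '.html') continue;
    const missing = findImagesMissingAlt(readFileSync(join(dir, name), 'utf8'));
    if (missing.length) report[name] = missing;
  }
  return report;
}
```

No browser, no rendering, no network. Just reading files that already exist on disk, which is why the whole loop finishes in seconds instead of minutes.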

Feedback loop resolved. Code, test (errors), fix, test (pass), push.

Constraints breed creativity

The What?

A modular, extensible, CLI-first static site validator and diagnostic tool that runs against your build output and plugs into any environment. It ships with eight core checks.

Each check is built as a module that can be configured for different project requirements via the configuration file. It scans 50 pages in seconds. That's fast enough to run on every build and get instant feedback while you're building.

The core engine is 38.4KB unpacked. The only dependencies are Changesets for versioning, pnpm for monorepo management, and Node.js itself.

No bloated dependency trees. No hidden framework requirements. Just focused tooling that you can extend as needed. Run it as a dev dependency, include it in your agentic or CI/CD workflow, or adapt it and make your own tools with extensions or custom checks that plug into the core engine.
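To make the "custom checks" idea concrete, here's what a pluggable check could look like. Important caveat: the shape below (`id`, `description`, `run`, the `page` object) is my own illustration of the pattern, not Optimize's documented plugin API.

```javascript
// Hypothetical custom check: fail if a page still links to a local
// dev server. The id/description/run shape and the page object are
// my own illustration, NOT Optimize's documented plugin API.
const noDevUrlsCheck = {
  id: 'no-dev-urls',
  description: 'Fail if a page links to a local dev server',
  run(page) {
    const leaks = page.html.match(/https?:\/\/localhost[:\/][^"'\s<]*/gi) ?? [];
    return leaks.length === 0
      ? { ok: true }
      : { ok: false, messages: leaks.map((u) => `dev URL found: ${u}`) };
  },
};
```

The appeal of a shape like this is that each check stays a plain object with a pure function, so it's trivial to test in isolation and share between projects.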

I decided to call it Optimize, and it's a project I'm working on at NOVL. If you're building static sites—whether with Eleventy, Hugo, Astro, Next.js static exports, or Jekyll—this is for you.

Try it out

Currently it's in beta and I'd love your feedback. The plan is to open source it, but for now, real feedback matters. It solved a problem I kept running into. Maybe it'll solve yours too.

Install the package:

# Install on your project
npm install -D @bynovl/optimize 

Set up your config (optimize.config.js):

export default {
  outDir: 'dist', // required
  ignore: ['404.html', '403.html', '402.html'],
}

Start testing:

# Runs a full audit
npx optimize

Join me

Come join me in optimizing the web. Leaving no site behind. Sign up for the beta.
