JavaScript: Weekly Summary (October 06-12, 2025)

Key trends, opinions and insights from personal blogs

I keep bumping into two little truths about JavaScript this week. First: people still use it for everything from digging through a school app to emulating 1980s CP/M machines in the browser. Second: the conversation keeps circling back to friction — compilation time, messy HTML, scraping defenses, and the small annoyances that make or break a tiny project. I would describe this batch of posts as a grab bag of practical hacks, nostalgia, and reminders that the web is still this messy, brilliant, DIY place.

Small tools that do big emotional work

There’s a post by Bin Wang about Seesaw, the kindergarten/daycare app. The story on the surface is simple: the author wants to save all the photos of their kid before the account goes away, and Seesaw’s download tools don’t cut it. So they wrote a JavaScript scraper to pull all the raw images from the site. That’s it — a small script. But it’s also very human. To me, it feels like watching someone build a little rowboat to bring home something precious.

Bin walks through the usual annoyances: messy HTML, a UI that gives you the finger when you try to bulk-export, and the weird emotional dimension of watching your kid grow through a gallery of photos. The technical bits are plain and useful: how to find image URLs, how to iterate over messages, how to cope with rate limits. But the meat isn’t the code. The meat is that small code doing heavy emotional lifting.
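The pattern behind a scraper like this is simple enough to sketch. Here is a hedged version of the general shape — not Bin's actual code, and the field names (`attachments`, `imageUrl`) are illustrative assumptions, not Seesaw's real schema:

```javascript
// Sketch: walk a list of message objects, pull out image URLs, and
// download them one at a time with a polite delay between requests.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

function extractImageUrls(messages) {
  // Flatten every attachment URL out of the message list.
  // Field names here are hypothetical, not Seesaw's real schema.
  return messages.flatMap((m) => (m.attachments ?? []).map((a) => a.imageUrl));
}

async function downloadAll(urls, delayMs = 500) {
  const blobs = [];
  for (const url of urls) {
    const res = await fetch(url); // one request at a time
    blobs.push(await res.blob());
    await sleep(delayMs);         // crude rate limiting
  }
  return blobs;
}
```

The serial loop plus a fixed delay is the blunt version of the rate-limit handling Bin describes; it trades speed for not getting blocked.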

I’d say this captures one of JavaScript’s powers: it’s cheap and fast to hook into a site and make something that matters to one person. It’s like baking a loaf at home because store bread tastes the same but doesn’t have the story. You’ll want to click over to see the code snippets and the little hacks — the author shares the rough edges and that’s useful.

When JavaScript reaches into the guts: WebGPU and shaders

On a very different scale, Hugo Daniel is poking at WebGPU and WGSL. He built an app where you tune sliders to add complexity to shaders and then watch what happens to compile times. That’s nerd candy for people who care about frames, latency, and the moment a pipeline stalls while the GPU thinks.

The takeaway isn’t fancy. It’s: shader compile time matters. A rotating triangle compiles fast. Add branching, loops, or big math and compile time climbs. Hugo calls out cold vs warm compilation — which is the sort of detail you only notice when you’re trying to ship something that doesn’t stutter on first run. The tool also spits out device/adapter info, which is a nice reminder that vendor limits are often different in practice than in docs.
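The basic measurement is easy to reproduce. A minimal sketch — not Hugo's app — assuming a browser with WebGPU, and returning null elsewhere:

```javascript
// Time how long a WGSL shader module takes to compile.
// getCompilationInfo() resolves once compilation has finished,
// so the elapsed time brackets the compile itself.
async function timeShaderCompile(wgslSource) {
  if (typeof navigator === 'undefined' || !navigator.gpu) return null; // no WebGPU here
  const adapter = await navigator.gpu.requestAdapter();
  if (!adapter) return null;
  const device = await adapter.requestDevice();
  const t0 = performance.now();
  const module = device.createShaderModule({ code: wgslSource });
  await module.getCompilationInfo(); // wait for compilation to settle
  return performance.now() - t0;
}
```

Run it twice with the same source and you get a rough cold-vs-warm comparison — though, as Hugo notes, measuring that rigorously is harder than it looks, since drivers cache aggressively.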

To me, this is the kind of post that whispers: don’t be smug about WebGPU. It’s powerful, but there are non-obvious costs. Think of it like buying a sports car: you get speed, but you also get higher taxes, higher insurance, and the need to find the right petrol.

There’s a curiosity-thread here. Hugo wants to measure cold vs warm compilation more rigorously and hints at future rendering-engine work. If you tinker with graphics at all, this is worth a detour.

Node.js keeps moving features from npm into core

Stefan Judis posts a short, cheerful note: Node.js now has a native glob utility (since v22.17). Three flavors: fsPromises.glob(), fs.glob(), and fs.globSync(). This is the kind of change that feels small but unclogs a hundred tiny scripts.

I’d describe this as “finally” but the word is too dramatic. It’s more like when you find the right drawer for your forks and suddenly your kitchen looks nicer. People have depended on glob packages for years — the combined npm downloads are enormous — and now you can skip an external dep for quick file ops.

This touches a theme: the platform keeps absorbing functionality that was once a community add-on. That reduces friction, which always makes smaller projects happier.

Reuse and restraint: the front-end hygiene sermon

Chris Ferdinandi writes about creating reusable code, especially in UI-heavy apps. The post is practical and a bit stern in a good way. He points at the usual suspects: repeated DOM structures, duplicated CSS, and widget-y code clones. The prescription is familiar — components, shared CSS classes, small APIs — but the voice is hands-on, not academic.

What stood out was how he frames reuse as a user experience issue, not just developer laziness. If you copy-paste a modal a dozen times, you’ll also copy-paste accessibility mistakes. He’s pushing the point that reuse is a kindness to future-you and future-users.
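The fix is as small as the problem. A minimal sketch of the "small API" idea — one template function so the accessibility attributes live in exactly one place. The markup and class names are illustrative, not Chris's code:

```javascript
// Render a modal from one function instead of copy-pasting its markup.
// Fix the ARIA wiring here once and every modal in the app gets it.
function modalHTML({ id, title, body }) {
  return `
    <div class="modal" role="dialog" aria-modal="true" aria-labelledby="${id}-title">
      <h2 id="${id}-title">${title}</h2>
      <div class="modal-body">${body}</div>
      <button class="modal-close" aria-label="Close">&times;</button>
    </div>`;
}
```

The point isn’t the specific markup; it’s that `role`, `aria-modal`, and the label wiring are written once, so a fix propagates instead of needing a dozen copy-paste repairs.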

To me, it feels like a reminder that the smallest design choices ripple outward. Don’t treat HTML and CSS like disposable napkins. Give them a bit of care.

Modern CSS vs. the React-ification of everything

Rach Smith took a different tack: they built a small site for a friend after 14 years away from web dev and were struck by how far CSS has come. Things like custom properties and display: grid make layouts that used to be a fight into a couple of tidy rules. Rach contrasts the pleasure of this small, low-dependency site with the modern trend: clients wanting low-code platforms or big React stacks.

This is one of those posts that smells faintly of nostalgia — not for the sake of nostalgia, but because the simpler tools work for many cases. Rach says “Who needs a flying car when you have display: grid?” That’s not a dig at progress; it’s an argument that sometimes basic CSS is the right tool.

There’s a pattern here with Chris’s piece: both authors tilt toward simplicity and reuse. Both seem tired of seeing frameworks used like sledgehammers for thumbtacks.

Retro projects and the joy of building odd things in JS

Two posts this week are pure nostalgia and tinkering. Unlisted Retrograde Holdout built TRACKorDIE’94 — a React web app to play classic tracker music with a 90s aesthetic. The project is loud in personality: playlist support, vintage UI, and the author’s grumpy delight in the quirks. They admit bugs and ask for hosting help. It’s one of those projects you’d fork just for the vibe.

Then Chris Garrett finally gets a full CP/M emulator working in the browser. This is deep nerd territory: Z80 emulation, Microsoft BASIC, Star Trek, and the pain of compiling C on a 1970s-style OS. The breakthrough came when the author integrated a compact CP/M80 emulator (MyZ80) and added file import/export. Now the browser can boot old software, and the author gives a quick start guide.

I would describe these posts as evidence that JavaScript is the medium of hobbyist hardware nostalgia. It’s the modern equivalent of soldering LEDs to a perfboard in your garage. These projects are unapologetically labors of love. They’re not shifts in enterprise architecture. They are tiny, stubborn islands of joy.

Scripts vs systems: scraping, anti-scraping, and the web’s new etiquette

Michał Sapka ran three posts that are basically parts of the same conversation about Anubis, a server-side anti-scraping tool that uses a proof-of-work scheme and modern JavaScript features to fend off aggressive scraping by AI outfits. One version is called Alien: Earth (Season 1), another is Prague, Czech Republic, and the third references Philip K. Dick — but they’re the same idea repeated with small variations.

The gist: Anubis intentionally forces clients to solve small computational puzzles to access pages, which makes mass scraping expensive but still keeps legitimate human users moving. It also checks for modern JavaScript features, so bare HTTP clients and headless setups that don’t execute JavaScript get stuck.

This gave me pause. I’d say it feels like watching a neighborhood install speed bumps to slow down joyriders. On the one hand, site owners are defending their content, which is fair. On the other hand, these measures can make life harder for accessibility tools, old devices, and some legitimate crawlers.

Michał’s writeups aren’t just technical; they’re a little philosophical. He notes that Anubis requires modern JS, and that can be broken by privacy plugins or headless setups. That’s a tension: protecting content versus preserving an open web that works for everyone. The posts hint at trade-offs and edge cases — the kind you’ll only notice when you’re building a scraper or an archive.

Recurring themes I noticed

  • JavaScript as practical glue: People use it to solve real, immediate problems. Bin’s image downloader, Chris Garrett’s emulator file handling, and Stefan’s nod to native globbing are all about making small tasks smoother. I’d say JavaScript still wins at low-friction tooling.

  • Friction is the story: friction shows up as compile times in WebGPU, messy DOM in scraping, duplicated code in UIs, and site defenses like Anubis. Authors are either reducing friction (reuse, native APIs) or pointing out where friction bites (shader compiles, anti-scraping).

  • Nostalgia and hobby projects keep the ecosystem lively. TRACKorDIE’94 and the CP/M emulator are not trying to be scalable products. They’re like building a model railroad in your spare room. They remind us that JavaScript is as much for play as it is for production.

  • Platform consolidation: Node pulling in globbing is a quiet trend. Platform APIs keep eating the things we had to install from npm. That matters for low-dependency scripts and for people who just want to get things done without dependency management noise.

  • Simplicity vs. complexity: Rach and Chris push for simpler CSS-first designs and component reuse. Others, like Hugo, point to complexity that can’t be avoided if you want high performance in graphics. The argument isn’t binary — it’s about picking the right tool for the job.

  • Defense and governance: Michał’s Anubis pieces show that site owners are increasingly trying to make scraping expensive. That’s new-ish social infrastructure for the web and it raises usability questions.

A few disagreements and gentle tensions

Some posts cheer for fewer dependencies and less JS. Rach Smith built a happy little site with modern CSS and minimal JavaScript. Chris Ferdinandi preaches reuse to avoid brittle UI code. They’re saying: keep things small, keep them maintainable.

But Hugo Daniel and the graphics crowd are telling a different truth: sometimes you need heavy lifting. WebGPU isn’t a toy; it’s a big hammer for specific tasks. That means you’ll pay complexity tax — build systems, shader compilers, device quirks.

This is the classic trade-off: use lightweight, resilient tools for most things, but don’t pretend you can do high-performance rendering with templating engines and vanilla CSS alone. Pick your battles.

What I’d tell someone skimming this week’s posts

  • Want to save a personal archive? A short JS script will probably do it. See Bin Wang for a hands-on example.

  • Building graphics? Measure compile times and don’t assume the first frame is representative. Check Hugo Daniel.

  • Need to find files in Node scripts? There’s now fsPromises.glob() in Node. No more reaching for a small npm package for quick hacks. Read Stefan Judis.

  • Repeating HTML and CSS? Refactor into components and shared classes. Your users will thank you. Chris Ferdinandi walks through it.

  • Building a tiny brochure site? Try modern CSS first. Don’t bring a chainsaw to a picnic. Rach Smith makes that case.

  • Feeling nostalgic? There’s a retro music player you can play with, and a full CP/M in the browser if you want to poke old software. See Unlisted Retrograde Holdout and Chris Garrett.

  • Worried about scraping or running bots? Take a look at Anubis and think about the consequences for accessibility and tooling. Michał Sapka has a few posts on it.

Little patterns that keep repeating in the details

  • People keep sharing the dirty parts. The posts are frank about messy HTML, weird vendor limits, and browser quirks. No one pretends their code is pretty. That’s refreshing. It’s like walking into someone’s workshop and seeing the tools scattered on the bench.

  • A small utility can save a day. Bin’s image downloader or Stefan’s tiny note about globbing are micro-optimizations with outsized returns for one-person projects.

  • The web still rewards curiosity. The CP/M emulator wasn’t a job requirement. It was curiosity and stubbornness. That spirit keeps JavaScript interesting.

  • There’s a new etiquette around anti-scraping. Anubis is the neighborhood watch of the web. It slows down aggressive bots. But it also asks visitors to run more JS. That raises questions about who gets excluded.

Little analogies I kept thinking about

  • JavaScript is like duct tape this week: you patch up small things like image archives and emulators with it. It’s messy but effective.

  • WebGPU is a sports car: fun, fast, but you need to respect its appetite for fine fuel (compile time, driver compatibility).

  • Node’s glob joining the core is like getting silverware in the main drawer instead of in a separate box. Small convenience, but it changes the routine.

  • Anubis is speed bumps on a suburban road: great for slowing down dump trucks, sometimes annoying for folks who bike through.

Tiny caveats and curiosities worth following up on

  • Hugo’s interest in cold vs warm compilation is a thread to watch. If you care about the first-run stutter, that’s a niche but impactful thing to measure.

  • The glob API in Node is handy, but check compatibility: not everyone is on Node v22.17 yet. For long-lived scripts you still need to consider who’s running your code.

  • Anubis-style defenses work now, but they might push scraping into even sneakier territory. There’s a cat-and-mouse dynamic here, and it touches law and ethics as much as tech.

  • The retro projects will likely remain hobby pieces, but they’re also great learning toys — and they often have tiny technical gems (like dealing with binary files in the browser) that you can reuse.

Final wandering thought

There’s a modest, pleasing theme here: folks are using JavaScript like a Swiss Army knife. When the problem is a small personal pain, reach for a tiny script. When the problem is specialized — high-performance graphics, emulation — bring the right engineering. When the problem is social — scraping vs access — expect a messy middle.

This week’s posts don’t rewrite any rulebooks. They don’t promise a radical new framework or some silver-bullet library. They’re more like notes left on the fridge: helpful, practical, sometimes a little cranky. If you want more detail, or you like the smell of someone’s workshop, go read the originals. The authors are generous with code and honest about the parts that still need work.

If you find yourself choosing between a quick script and a full rewrite, you’ll probably do the quick script. It’s what people do. And that’s always been the secret of the web: small, stubborn hacks that make life slightly better. Read the linked posts if any of those annoyances or curiosities hit home — there’s good practical stuff in the trenches.