Box Back

A record of learning (2025)

Cole Ellison • 2025-02-15

Updated: 2025-03-11

The organization of this page may evolve over time. Grouping sections purely by content type may prove less useful than organizing around particular topics of interest.

Generally, I tend to gravitate towards web performance, software design, geometry or mathematically-driven design. I’m currently interested in the local-first movement in app/web development and am experimenting with a small homelab setup.

Reads

Questions the old idea of spending “innovation tokens” and reaping complexity. Instead, it offers the idea of “boundary tokens,” which are spent when a project exits the area of what is well and commonly understood by contributors. It shifts the focus to valuing deep knowledge of one’s chosen tools and puts a higher cost on investing in services outside that scope, due to maintainability and familiarity concerns.

Yes, it’s from 2018, but it’s a nice dive into the render and commit phases of React.

Learning requires the ability to fail publicly. While this may be “okay” in groups of only senior+ employees, it becomes much more acceptable and common when instruction is a core part of the job. “Juniors force-multiply seniors, not by writing code, but just by forcing seniors to teach and rethink their knowledge.”

It’s nice to see my general impressions of Bazel (and Nx) reflected elsewhere (and with more backing experience). My time at Grammarly showed me how Nx punishes you for not blindly adopting its entire philosophy of package development, which feels largely built on layers of obfuscation, and how Bazel as a part of the JS/TS toolchain was massively cumbersome and unintuitive from a Node developer’s POV.

Turns out at least someone in the industry has known that LeetCode is bad signal for a decade.

Figma Engineering

Related prior/contemporaneous detail on the design and pursuit of the squircle can be found in the following: iOS 7 Icon Squircle and Unleashing Genetic Algorithms on the iOS 7 Icon (Mike Swanson)

This one reminded me of Eva Parish’s What I think about when I edit.

The difficulty of measuring productivity

A generative culture is an “organizational culture that is high-trust and emphasizes information flow is predictive of software delivery performance and organizational performance in technology.” The six key aspects of such a culture are: high cooperation, trained messengers, shared risks, encouraged cross-functionality (bridging), allowing failure to invite inquiry, and experimentation with the novel.

The “Core 4” introduces a frame on four pillars (speed, effectiveness, quality, and impact) for engineering org productivity. They take care to emphasize the known negative impact of using developer output metrics at an individual level and gamification of any productivity measurement through incentives. The “paper” itself (can five pages count?) is a brief pitch of this new framework. We used GetDX for a bit at Grammarly (perhaps Squarespace too, though I’m less certain). As a reporting system and interviewing platform, it seemed functional.

Against web bloat

The subsection on incremental decoupling here provides a framework for facilitating painful migrations. Using jQuery is easy, so preventing new uses of it in a draconian way would lead to inevitable animus. Setting up linting on new code and a PR bot to pull in the migration orchestration team on relevant PRs allows for the easy suggestion of alternatives. Stripping functionality out of the version of jQuery used was another good move to prevent regressions.
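The linting half of that ratchet can be sketched with ESLint’s built-in rules. The rule names below are real ESLint built-ins; the file globs and messages are illustrative, not taken from the article:

```typescript
// eslint.config.ts — a sketch of a migration "ratchet" (ESLint flat config).
// Applied only to new/touched code, it blocks fresh jQuery usage while
// leaving legacy call sites alone until they are migrated.
export default [
  {
    files: ["src/**/*.{js,ts,tsx}"],
    rules: {
      // Forbid new imports of the jquery package.
      "no-restricted-imports": ["error", {
        paths: [{
          name: "jquery",
          message: "New jQuery usage is frozen; prefer native DOM APIs.",
        }],
      }],
      // Forbid reaching for the global $ / jQuery escape hatches.
      "no-restricted-globals": [
        "error",
        { name: "$", message: "New jQuery usage is frozen." },
        { name: "jQuery", message: "New jQuery usage is frozen." },
      ],
    },
  },
];
```

The PR-bot half would then key off these lint failures (or off diffs touching known jQuery call sites) to loop in the migration orchestration team with suggested alternatives.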

Talks and Docs

Deals with the inevitability of instability and unknowns in the face of the unavoidable complexity incurred as software projects mature. Further reading includes Lehman’s laws of software evolution (Lehman & Belady), a set of pseudo-axioms describing the forces acting on software systems. Out of the Tar Pit (Moseley & Marks) is also recommended on accidental vs. essential complexity.

How to build a learning culture in an engineering org. Somehow, 15 minutes is enough to provide a full framework for the cycle of fostering resilience, growth, and collective education among engineers.

Doing something that looks like the inverse may be good, but nothing is guaranteed. Microservices are another design pattern for systems, and the collective sentiment that they operate as some golden panacea is likely overblown in many instances. I still, generally, believe in smaller services, but the added network overhead has to be justified by traffic data and good architecture. This led me to Assault by GC from Marc Gravell, which talks about an approach to circumvent GC issues StackExchange was having that caused outages.

The TypeScript team is migrating the TS compiler to Go (in 7.0). A senior staff engineer once told me that any migration/port of a mature project must carry a minimum of a 10x improvement in performance, reliability, or maintainability (with no degradation in the other categories). At the time, this seemed steep, but having seen more examples of modernization projects and their long tails to completion, the ROI for any given project needs to respect both the maturity of the system and the effort required to replace it.

Seeing this happen with the TypeScript compiler is exciting, since the change feels like a recognition that TypeScript is really not the best choice for writing a compiler. Tools that let you build things often like to dogfood themselves. “TypeScript is written in TypeScript” is great marketing material, but that style of conceit requires the functionality of the product to keep up with internal complexities and consumer expectations. If you find yourself needing a shoehorn and rib retractor to use your own tool internally, it may be time to invest heavily in modernization work so dogfooding stays viable, or just accept that you’ve grown out of the use case for your product. It’s nice to see that the TypeScript team has selected the former.

This is the type of engineering work that initially made me interested in software engineering, though primarily in the abstract, since I never made games in my early projects. This talk walks through three (ish) papers that step towards an elegant, efficient algorithm for determining whether two complex objects (composites of convex shapes) are intersecting. Along the way, it covers how metaballs are defined mathematically, discusses approaches to vector interpolation, and even touches on Voronoi diagrams. Good stuff; rather math-heavy.
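For a taste of the problem space, here’s the classical separating-axis test for two convex polygons in 2D, a simpler cousin of the approach the talk builds toward. This sketch is my own, not from the talk:

```typescript
// Separating Axis Theorem (SAT): two convex polygons are disjoint iff some
// edge normal of either polygon separates their projections onto that axis.
// Polygons are arrays of [x, y] vertices in order (CW or CCW both work).
type Point = [number, number];

// One candidate separating axis per edge: the edge's perpendicular.
function axes(poly: Point[]): Point[] {
  return poly.map((p, i) => {
    const q = poly[(i + 1) % poly.length];
    return [-(q[1] - p[1]), q[0] - p[0]] as Point;
  });
}

// Project every vertex onto the axis; return the [min, max] interval.
function project(poly: Point[], axis: Point): [number, number] {
  const dots = poly.map(([x, y]) => x * axis[0] + y * axis[1]);
  return [Math.min(...dots), Math.max(...dots)];
}

function convexIntersect(a: Point[], b: Point[]): boolean {
  for (const axis of [...axes(a), ...axes(b)]) {
    const [minA, maxA] = project(a, axis);
    const [minB, maxB] = project(b, axis);
    // A gap between the projected intervals proves the shapes are disjoint.
    if (maxA < minB || maxB < minA) return false;
  }
  return true; // no separating axis found
}
```

Testing every edge normal is O(n·m) per pair, which is why the papers the talk covers (GJK-style support-function methods) matter for composites of many shapes.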

Most creative software, especially the good iterations, tend to start from a place of passion. As tools become companies and profit becomes more important, the good intentions are (usually) tainted by the need to make shareholders happy rather than users. This leads to price hikes and increasingly antagonistic fee structures that are made to appear cheaper but are not. I don’t know how to solve this, and EndVertex doesn’t necessarily either. This video does, however, catalog the history of this in the context of 3D animation software and provides some thoughts on what could address the most major concerns (maybe).

Tools / Info

No promise that I’ve gone and used these, but I’ve certainly read about them and find them compelling enough to list.

This uses an “immediate mode” interface for specifying GUIs (all renders are “pure” and updates wholly replace items) rather than “retained mode” (CSS falls in this category, since existing elements can be modified). “Immediate mode” libs can reduce the amount of state stored across the application and minimize the recalculation handled outside their purview. There’s a 2005 video from Casey Muratori (blog), game developer at Molly Rocket, Inc. at the time, that explains the difference well.
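A toy illustration of the idea, with entirely made-up names (not any real library’s API): the app re-declares the whole UI every frame from its own state, and input is answered inline where the widget is declared, so the “framework” retains almost no widget tree.

```typescript
// Minimal immediate-mode sketch. `draw` stands in for actual rendering;
// `clickedId` stands in for this frame's input event. All names are
// illustrative.
type Draw = (text: string) => void;

interface Frame {
  label: (text: string) => void;
  button: (text: string) => boolean; // true if clicked this frame
}

function runFrame(clickedId: string | null, draw: Draw, ui: (f: Frame) => void): void {
  let nextId = 0; // widget identity comes from declaration order, not a stored tree
  ui({
    label: (text) => draw(text),
    button: (text) => {
      const id = `btn-${nextId++}`;
      draw(`[${text}]`);
      return id === clickedId; // input is answered inline, where it's queried
    },
  });
}

// Application state lives with the application, not inside widgets.
let count = 0;
const lines: string[] = [];
runFrame("btn-0", (t) => lines.push(t), (f) => {
  if (f.button("increment")) count++; // the click is consumed right here
  f.label(`count = ${count}`);
});
```

Because the UI function is re-run from scratch, there is nothing to keep in sync: no stale widget state, no separate update path, at the cost of re-evaluating the UI every frame.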