Reading Notes, April 2022
Article: “Can you count on what you measure?”
are the numbers good? They focus on easy-to-gather quantities and neglect any measure of quality.
I just want to stand and clap at everything in here.
Average time on page; bounce rates; sessions with search; page depth etc. Which of these are important for you to know? And for each metric, what number is a sign of success?
If you want to make something transformative, look where nobody else is looking.
In setting any metric it’s important to benchmark where you are, where you want to get to, and by when. This information prevents panic and helps track progress.
Imagine a maze for a minute. Heading towards your “goal” isn’t going to help. In fact, you have to do the opposite to get there. You have to do something your metrics will tell you is wrong: you have to move in a direction that, when measured, looks like failure. You move away from your goal to get to it. How do you justify that? Not everything is as clear cut as numbers make it seem.
Numbers aren’t intrinsically good or bad. They’re just indicators to help you understand a situation and take a sensible course of action. They aren’t written in stone to be slavishly followed forever.
A good set of meaningful metrics should be personal to your situation. The numbers you track should be one of many inputs, both quantitative and qualitative. What you measure will benefit from regular review and should be changed if the measurements no longer help you chart a course into your desired future.
Article: “Life’s a Party, Not a Race”
[people on] Forbes 30 Under […] found their callings at a young age and were able to doggedly pursue them. That is amazing…and rare.
For a lot of us, clarity takes its sweet time […]
If everyone lived from zero to 100 and matured at the same rate, it would be fair to issue sweeping comparisons. But that’s not how it works. We don’t all have the same opportunities. We don’t all take the same paths. We don’t all get the same amount of time.
I love the list of “people who found success after 40 and/or did cool stuff later in life”. Take, for example, Harry Bernstein who published his first memoir at age 96. He wrote two more books and declared: “The 90s were the most productive years of my life.”
In the United States, if you are pregnant over age 35, it’s considered a “geriatric pregnancy.” This is an outdated term — the preferred terminology is “advanced maternal age” — but trust me, the former still makes the rounds.
In France, a pregnancy over age 40 is called a “grossesse tardive” — as a French friend explained, tardive means you’re a bit delayed for something, “like when you’re late for a plane, or late to the party.”
Like the author, I love this recasting of terminology from “you’re old doing this” to “you’re late doing this”.
My good friend recently decided to go back to school in her mid-forties, to pursue a path that always spoke to her, but took a backseat to the more “reasonable” choices she made early in her career. “There’s a part of me that thinks, f*ck, have I wasted the last twenty years?” she laughs. “But the answer is no. I wouldn’t have been as ready for it as I am now. In the end, everything has its time.”
Article: “UA gotta be kidding”
It's hard to overstate just how complex and intertwined [the UA string] is and what astounding amounts of money across the industry have been spent adversarially on ... a string.
Really makes you wonder how divorced our perception of UA string data is from reality. And the truth is, probably nobody really knows.
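A toy sketch of why that data is so shaky (my own illustration, not from the article): a modern Chrome UA string claims to be half a dozen browsers at once, so the classification you get depends entirely on the order of your substring checks.

```python
# A real-format Chrome UA string: for historical compatibility it
# name-drops Mozilla, AppleWebKit, KHTML, Gecko, Chrome, AND Safari.
ua = ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
      "AppleWebKit/537.36 (KHTML, like Gecko) "
      "Chrome/100.0.4896.75 Safari/537.36")

def naive_browser(ua: str) -> str:
    """Naive substring matching: the check order decides the answer."""
    for name in ("Safari", "Chrome", "Firefox"):
        if name in ua:
            return name
    return "unknown"

print(naive_browser(ua))  # prints "Safari" -- wrong, this is Chrome
```

Real UA parsers carry long, constantly updated rule lists to untangle exactly this, which is the adversarial money pit the article describes.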
Talk: “Superintelligence: The Idea That Eats Smart People”
The way that we've found that's most effective to get interesting behavior out of AIs is to just pour data into them. This creates a dynamic that is really socially harmful. We're on the point of introducing these Orwellian microphones into everybody's house and all that data is going to be used to train neural networks which will then become better and better at listening to what we want to do.
If you think the road to AI goes through this pathway, then you really want to maximize the amount of data that is collected…It reinforces this idea that we have to collect as much data and do as much surveillance as possible.
I always love Maciej’s take on tech.
AI risk is like string theory for programmers. It’s fun to think about, you build these towers of thought and then you climb up into them and pull the ladder up behind you so you’re disconnected from anything. There's no way to put them to the test short of creating the thing which we have no idea how to do.
Article: “Speed Needs Design, or: You can’t delight users you’ve annoyed”
The Web’s size and diversity makes client-side “fast enough” impossible to judge.
The web’s size and diversity make the assertion “____ enough” misleading in really any circumstance unless you have more context. Nothing is ever “enough” unless you can say “enough compared to ____”.
when fresh pageloads are fast, you can cheat: who cares about reloading when it’s near-instant?
Hitting refresh is the “have you tried turning it on and off again” of the web. And nobody will care to reboot your web page if the cost is negligible.
So many good nuggets in this series.
Article: “Blogging and the heat death of the universe”
the thing that lasts longest with our websites is probably the part that we spend the least time thinking about—the markup…
This is the second law of thermodynamics made clear on the web: the entropy of any isolated system always increases and, at some point or another, all that’s left of a website is the markup.
Talk: “Haunted by Data”
Data pipelines take on an institutional life of their own. It doesn't really help that people speak about “the data driven business” like they’re talking about “the Christ centered life” in these almost religious tones of fervor.
Great counterpoints to the religion of data.
The promise you’re told is that enough data is going to lead you to insight.
I worry the reason we haven't learned from the fiasco of the '60s, the systems analysis, the fetishizing of data, is because after all it's only anecdotal. There's only the one data point.
Talking about Eroom’s law, which is Moore's law but backwards for the drug industry (the research that 2 cents could buy you in the '50s costs you 1 dollar today, and the cost is increasing exponentially).
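A quick back-of-envelope check on those figures (the ~70-year span is my own assumption): 2 cents to a dollar is a 50x cost increase, which implies the cost of a fixed unit of drug-discovery output doubles roughly every 12 years.

```python
import math

# Figures from the note; the ~70-year span is an assumption of mine.
then_cost, now_cost = 0.02, 1.00
years = 70

factor = now_cost / then_cost                      # 50x increase
doubling = years * math.log(2) / math.log(factor)  # implied doubling time

print(f"{factor:.0f}x over {years} years = doubling every {doubling:.1f} years")
```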
The basic fact is that a chain-smoking chemist just randomly shooting compounds into mice is a more cost-effective way to discover drugs than an entire genomics data set. That is a bizarre result.
Speaking about this relationship where you measure the world and then make judgements. Then humans enter the world, see what you're modeling and measuring, and adapt to get around your measurements. So you take into account their cheats and then update your rules.
Notice what you've started to do. Instead of just measuring the world, you're now in this adversarial relationship with another human being and you've introduced issues of power and agency and control that weren't there before. You thought you were getting a better idea of what is happening in the reality, but you've actually just introduced an additional layer between yourself and reality. This kind of thing happens over and over again.
Talk: “Designing Fluid Interfaces”
Characteristics of the physical world make great behaviors.
In the same way that you would design an icon for mass interpretation, leveraging concepts familiar to the most amount of humans possible, you can do the same with non-visual intuitions we share as humans, such as how objects move through time and space.
We all have a shared understanding, or shared intuition, for how a car moves through the world.
We all know intuitively through experience how objects move through the world and how we can manipulate those objects depending on their movement.
you might notice that I haven't used the word duration. We actually like to avoid using duration when we're describing elastic behaviors, because it reinforces this concept of constant dynamic change. The spring is always moving, and it's ready to move somewhere else.
A great talk.
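A minimal sketch of a duration-free elastic behavior (my own toy code, not from the talk): a damped spring is defined by stiffness and damping, never by a duration, so it can be retargeted mid-flight and simply keeps moving until it settles.

```python
def spring_step(pos, vel, target, dt=1/60, stiffness=120.0, damping=14.0):
    """One semi-implicit Euler step of a damped spring toward `target`."""
    accel = stiffness * (target - pos) - damping * vel
    vel = vel + accel * dt
    pos = pos + vel * dt
    return pos, vel

pos, vel, target = 0.0, 0.0, 1.0
for frame in range(180):          # three seconds at 60 fps
    if frame == 60:               # mid-flight, the gesture changes course
        target = 0.25
    pos, vel = spring_step(pos, vel, target)

# pos has settled near 0.25 -- and no duration was ever specified
```

Because the spring carries its velocity across the retarget, redirecting it at frame 60 produces a smooth course change rather than a restart.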
Article: “One startup's quest to take on Chrome and reinvent the web browser”
Everyone at The Browser Company swears there's no Master Plan, or much of a roadmap. What they have is a lot of ideas, a base on which they can develop really quickly, and a deep affinity for prototypes. "You can't just think really hard and design the best web browser," Parrott said. "You have to feel it and put it in front of people and get them to react to it."
Constant prototyping as a strategic advantage: if you have the infrastructure to consistently try, iterate on, and deliver new things – along with ever-frothing ideas – you open yourself to serendipity and, once something strikes, you’ll have everything in place to deliver it fast and effectively.
Article: “Design System Coverage”
[Look at all these different components.] What variety! And that’s ok! This is the reality of enterprise product design at scale. It reflects the nature of parallel roadmaps, design system team resourcing and bandwidth, business priorities, and many more factors.
A great read, and dose of reality, on design systems.
Some organizations seem to hold up the ideal that, once a design system exists, everything in an interface can and should be built with it. Not only is that an unrealistic goal for most enterprises, but it can often be a toxic mindset that anything less than 100% coverage is misuse of a design system at best or utter failure at worst.
We often use the Pareto principle—often known as the “80/20 rule”—to set an actionable target for teams: aim for up to 80% of any given page to be made of design system components and leave room for 20% of the page to be custom. That remaining 20% is where the invention and innovation can happen. One of our recent clients added some anecdotal and complementary motivation to this: they reported that they spent only 20% of their sprint time creating 80% of their pages with the design system, which then freed up 80% of the sprint time to work on the 20% of custom functionality that really made the experience sing. This is exactly the kind of efficiency that design systems should enable!
This can be a hard thing to get people to understand.
We used to suggest 10% as a starting point, with a plan to work up to 80% eventually, likely over the course of a year or two.
Article: “Server-side vs Client-side Analytics”
My trust in analytics data is at an all-time low.
Great post by Dave. It’s absolutely wild to me the disparity between data sets that, presumably, are measuring the same thing.
If I, or some hypothetical manager, put too much stock into these metrics I could see it causing a firestorm of reprioritization based on bot traffic. We’d be chasing the tail of a…bot somewhere in Germany.
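The disparity is easy to reproduce with a toy model (my own illustration, not Dave's data): server logs count every request, while a client-side beacon only counts visitors whose browsers actually run the analytics JavaScript, which bots mostly don't.

```python
# Each record: who requested the page, and whether it executed our JS.
requests = [
    {"ua": "real browser (Chrome)", "ran_js": True},
    {"ua": "curl",                  "ran_js": False},
    {"ua": "Googlebot",             "ran_js": False},
    {"ua": "unnamed scraper bot",   "ran_js": False},
]

server_side = len(requests)                            # logs count everything
client_side = sum(1 for r in requests if r["ran_js"])  # beacon needs JS

print(server_side, client_side)  # prints "4 1": same site, two stories
```

Neither number is "the truth": one overcounts by including bots, the other undercounts by missing blocked or non-executing clients.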