Deep rest as slow fade. Senioritis energy. Here & not here.
Plans are made. There’s a lot to trim away & focus on. This will be our first deep rest after our repositioning, and this year we dealt with a lot in the wider world. I’m looking forward to spaciousness. Finessing the pack load. Cooking elaborate meals. Spending time with the dogs.
I always remain deeply grateful for the work I get to do, but this year it hits different. Thanks for all of your support this year. It means the world to us.
Sometimes a thing happens in my industry that is so mind-blindingly stupid that I get troll-excited about it. I hunch my shoulders and grin like a profoundly deranged person, and start to yell about its inherent dumbness to anything with a pulse. I become so thoroughly not-a-fan of the thing that something in my brain snaps and I start to think about it all as a grand joke. While we’re here, I don’t know how my brain works either, thank you for asking.
Take, for example, LLMs that click the thing for you. Used to be that if you wanted to buy something or do something online, you’d have to click the thing. Now you can tell a robot which thing to click, and you can then tell the robot to go off and click the thing, with or without any confirmation. What could possibly go wrong, I laugh, hunched.
This has apparently become enough of a thing that Amazon sued an LLM company. The thing apparently comparison-shops, buys at a certain price, etc. And in doing so, it scrapes & DOM-parses the heck out of Amazon, logging into Amazon as you, etc. And so of course it will then have access to all of your personal information – including your billing & shipping addresses and lifetime Amazon order history, the latter of which you cannot delete.
You may not want an LLM to know all of this. Or you may not trust an LLM to make the right purchasing decisions for you and not, say, buy the wrong thing, or buy the right thing in the wrong size or color, or buy 50,000,000 of the right thing, or ship the right thing to the wrong place. But people don’t seem to care enough to stop using the now-sued LLM tool, presumably because clicking is and always has been tedious & annoying, and they would rather enlist a robot to click for them.
Shopping is a common task online and a lot of people hate doing it, so it makes sense that there would be a custom-tailored LLM for it by now. So let’s go back to that tedious & annoying bit. We literally invented a whole branch of design that is meant to make software less hard to use. Then it got mostly sidelined, software became considerably worse, and now we’re mad that people are trying to route around the mess everyone made so they can get back to their lives. I truly cannot blame them, even if I think they’re doing the right thing in the wrong way!
The lawsuit is interesting. On the one hand, it makes perfect sense that platforms want real people with real agency to be making load-bearing decisions on them. If all Amazon customers used LLMs to buy stuff, I’m guessing return rates would spike and customer satisfaction would drop – and Amazon would make way less off of their own in-platform advertising. On the other hand:
I am very excited to see what happens with all of this! My shoulders are hunched. What a time to be alive.
You just read issue #256 of Draft's Letters. You can also browse the full archives of this newsletter.