Saturday, June 14, 2025

Entry 765: It's Just Life

I can't say it was the best week I've ever had. It seems like when things are going well for me personally, the world at large is on fire, and when things seem (relatively) stable in the world, my own little nook is unsettled. That could just be perception, though, like we each have our own personal level of stability, and we inflate or minimize things in our own mind until we reach that level. That's why small things sometimes tip us over the edge, or why we often just start feeling better about things that haven't changed. It's also why some people are constantly living life on the brink and others always seem to be on an even keel, even though the external circumstances of each person are roughly comparable.

In general, I think my baseline level of anxiety is relatively low -- or maybe I'm just good at coping. Because I actually do worry a lot, about everything, big and small, but then at some point I say, Fuck it, it's just life and get on with my day. But this week definitely pushed me above my norm. It started with the ICE raids and subsequent riots in LA, moved into a new war in the Middle East, took a quick detour into politically motivated killings in Minnesota, and is ending tonight with a North Korean-style military parade about seven miles from where I currently sit. That's a lot. Oh, and don't forget, the robots are coming for all of our jobs and climate change is still an existential crisis.

It was a double whammy earlier this week, too, as S was super stressed out for reasons I won't go into (other than to say it had nothing to do with me directly, thankfully), and when your spouse is stressed out, it acts as a force multiplier for your own stress level. S goes to bed a few hours before me, so she's usually in deep REM sleep by the time I'm crawling in to join her, but on Wednesday when I came into the room, I heard those three dreaded words: "I'm still up." It's the worst, because a) it's painful to see someone you care about in distress, and b) it means I'm not getting any sleep any time soon. Even if she doesn't want to dump everything onto me and talk things over, even if we're both just lying there quietly, I can feel the stress emanating from her and being absorbed by me.

Although, to be fair, I don't think I was getting much sleep that night anyway. Right before bed I was listening to The Bill Simmons Podcast, which is usually relaxing, but he had Chuck Klosterman on, and they ended the discussion talking about AI, and it was extremely grim. I'm not totally convinced AI is going to completely upend society in a negative way, but I'm not not convinced of it either. I'm in the "it's a coin toss" camp, and the thing about coin tosses is that you lose them just as frequently as you win them. What I do know is that from a government policy standpoint, we are absolutely not equipped to handle it. Even if we had the best and the brightest in charge, we might still get it wrong, and we currently have nothing near the best and the brightest. We are at the mercy of the tech companies, and their message seems to be: This thing that we are making is absolutely going to destroy us all, but we have to keep making it, because if we don't China will destroy us all first.*

*It's like the opposite of the joke in Silicon Valley when the duplicitous tech CEO Gavin Belson says "I don't want to live in a world where someone makes the world a better place better than we do."

So, when I couldn't sleep Wednesday night, I thought about what I would do if I could do something about AI, and I came up with three things.

1. Outlaw driverless cars for transporting people or goods. We have drone airplanes that can fly themselves (or be controlled remotely), but commercial flights still need a human pilot in the cockpit (two of them, even). Let's make it the same for cars. It would protect jobs and add an extra layer of security and peace of mind. We can still use self-driving technology, but a human has to physically be in the driver's seat for the duration of the trip.

2. Make it explicitly illegal to make deep fakes of somebody without their permission or without clearly and repeatedly stating that it's not real. There is a thing now in sports social media where you will see a clip of somebody being interviewed, and they are giving strange answers, and you don't know if it's real or an AI-enhanced fake. It's only going to get worse as the technology gets better/more accessible. If something is obviously phony, either because it's clearly somebody acting (like Bad Lip Reading) or because it's labeled as such, then that's fine -- that's satire and should be protected by the First Amendment -- but if it's not, then it should be libel and/or fraud and subject to punishment. And it might be necessary to regulate social media companies for disseminating this stuff as well. Few things are more dystopian to me than living in a world in which nobody knows what's real and what isn't. It's funny when it's a Nathan Fielder show,* not when it's just life.

*Loved the new season of The Rehearsal, by the way, speaking of humans in the cockpit.

3. Make a law that content creators get paid if their copyrighted material is used to train an AI algorithm. I have no idea how this could be done, but I bet somebody out there could figure it out. Just like an artist gets some money every time their song gets streamed, they should get some money every time an AI algorithm references their work. Like there's AI Spotify, and you give it a prompt "make a hip-hop dance song," and every artist whose work it references to make the song gets half a cent or something. Humans can freely borrow ideas from other humans (we can't help it, anyway); machines should have to pay. 
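If I were sketching how the accounting might work, it'd be something as simple as this -- made-up numbers, made-up names, and the genuinely hard part (actually knowing which works the model drew on) hand-waved entirely:

    # Toy sketch of the half-a-cent idea. The rate and the artist names are
    # invented; the real problem of attributing which works were referenced
    # is assumed to be solved somewhere upstream.

    PER_REFERENCE_ROYALTY = 0.005  # half a cent per referenced work (made up)

    def split_royalties(referenced_artists):
        """Return how much each artist is owed for one generated song."""
        payouts = {}
        for artist in referenced_artists:
            payouts[artist] = payouts.get(artist, 0) + PER_REFERENCE_ROYALTY
        return payouts

    # One "make a hip-hop dance song" prompt that pulled from three catalogs:
    print(split_royalties(["Artist A", "Artist B", "Artist C"]))
    # {'Artist A': 0.005, 'Artist B': 0.005, 'Artist C': 0.005}

The money per song is tiny, but so is a Spotify stream, and it adds up at scale.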

Alright, that's all I got for today. Until next time...
