Comment on car insurance

dumpsterlid@lemmy.world ⁨7⁩ ⁨months⁩ ago

For the vast majority of Americans, having a car is a mandatory part of having a job?

I can’t remember the last job I applied to that didn’t specifically ask whether I had a driver’s license and a car.

Yes, owning a car is mandatory in most places in the US. I don’t like it, but believing otherwise is a strange distortion of most people’s reality.
