Every time I think I have a handle on it, something new happens. This time of year is always bad for me, because the sun has started coming out, which is going to make everything far more noticeable. I am sick of hearing people say "oops, did you make a mistake with your sunscreen?" or "you have some dirt on your hand" (um, that's not dirt.. it's just the teeny bit of skin that still has pigment!).
I am so sick of it that I'm contemplating depigmentation therapy, even though I don't think it's spread enough for that yet.
I know this is a common thread in this community.. but I just needed to vent to people who will GET IT. Most of my friends just tell me they don't notice it (haha) or that it's not like it's skin cancer or bad acne or something (as if that makes it better).
Thanks for listening, everyone.