There was a change in management at my work, which means I may be looking for a job sooner rather than later. I’m okay with that; however, I never felt like I was doing anything meaningful for humanity.
Which leads me to ask: do you feel like you’re doing something meaningful at work, by whatever your definition of meaningful is? If so, and you don’t mind saying, what do you (or what does your company) do?