Let me just come out and say it. Summer sucks.
Now that we are in college, summer just isn't what it used to be: going to the pool, hanging out with all your friends, going to camp and so on. We are adults now, and we have to do adult things. Most importantly, we need to find summer jobs.
There are two main paths to take when working between semesters during the long summer months: find an internship that furthers your career, or find a job that gives you some spending money for the school year.
Why should there be a stigma attached to working a job over the summer that doesn't further your career? Since when do we have to give up our childhood at the age of 18?
I know my parents didn't. My mother was a waitress, and my father worked at a gas station during his summer breaks. Not having "important" internships didn't prevent them from becoming a physical therapist or a doctor, respectively.
Why, in the year 2008, are internships treated as the only way into graduate school? I think all of this is nonsense. But, of course, this is the world we live in, where if you aren't working toward eventually making the big bucks in your career, you are wasting your time.