Thursday, November 22, 2012


When we moved back to Lafayette, almost 5½ years ago, I looked out our bedroom window and saw a traffic light. I don’t know exactly why, but that traffic light was very comforting to me. I suppose it stood as a constant, my connection to the world. Even though it was off in the distance, it stood as a reminder that even in the midst of the cornfields, we were not in the middle of nowhere; we were on the edge of somewhere. I looked out at it, almost every night, as it changed from green to amber to red and back. It was my traffic light, and I loved it.

Since that time, a lot has changed, around here. One of the streets that intersects at that light has been widened and combined with another street, necessitating a new and larger-sounding name. The other street has undergone significant construction, including a four-lane bridge over the railroad tracks where we used to have to wait for trains to pass. A factory has gone up on the corner of the two; a warehouse sits next to it; a third large building is now under construction. Pretty soon, the entire stretch will be commercially developed. A bit closer to home, the area that previously consisted of a cornfield and some wetlands is now our children’s elementary school, our new church building, and a still–under-construction residential neighborhood. All this happened in just 5½ short years.

Yet despite all this, my traffic light is still there, still constant. Despite the construction, despite the manufacturing, despite the new schools and churches and homes, despite the wider roads and bridges and even another traffic light that has since been obscured by the new construction, my light is still visible. It still shines every single night, right out my bedroom window, changing from green to amber to red and back. It’s comforting, that continued consistency, and for this I am thankful.

In short, I thank God for small blessings.

Thursday, November 15, 2012

Poor Leah…

Tonight, while Anna and I were watching a show, our six-year-old daughter, Leah, came downstairs, obviously slightly distressed. Anna asked her what was up; Leah responded that she had to go to the bathroom—a rather cryptic statement, since there is a perfectly good bathroom, perhaps ten steps from her room, which she uses all the time. Still, we told her that that was fine and encouraged her to use the downstairs bathroom and get back to bed.

For whatever reason, Leah then proceeded to enter the kitchen—which, for those who have never been in our kitchen, does indeed feature running water, but does not particularly qualify as a bathroom. We asked her why she was going in there, but she didn’t answer, which of course led us to ask her again. This continued for about thirty seconds.

When she finally came back out of the kitchen, Leah again looked slightly confused. And again, we told her to go to the bathroom and head to bed. Her response: to walk right back into the kitchen, where she proceeded to get a paper towel.

When Leah finally emerged from the kitchen, the second time, I tried to snap her out of her apparent somnambulism by having her give me a hug. She did so, but not before accidentally kneeling on Anna’s legs and hurting her, in the process. Leah then gave me the paper towel, for no apparent reason, and with a little prompting, finally made it to the bathroom.

After going to the bathroom, Leah then came back to give us each another hug. We told her to go back to bed, cautioning her to go to her own room, not her sister, Naomi’s, nor her brother, David’s. She walked up the stairs and, less than a minute later, we were relieved to hear her door close. However, when we came upstairs, about ten minutes ago, we found Naomi’s door hanging wide open. While Leah was certainly not in there, we have little question about who did it.

Poor kid.

Thursday, November 8, 2012

Flannelman Seems a Mite Confused

When I was a sophomore in high school, there was a guy in my English class named Jeff Little. Jeff Little often wore flannel shirts and was thus dubbed “Flannelman” by the class clown, Chris Ziegler (whom, I now realize, I hero-worshipped for his ability to make everyone laugh). One day, during our study of Mark Twain’s The Adventures of Huckleberry Finn, we were discussing things that men and women just naturally do differently. Our teacher, Ms. Furia, asked us each to look at our fingernails. In general, the boys all held up our hands, palms facing us, our fingers bent halfway into a fist. The girls, on the other hand, held their hands with palms out and fingers extended. As Ms. Furia explained the difference, I happened to notice Jeff Little suddenly drop his hands to his side and glance around embarrassedly. Chris Ziegler obviously caught it, too, and deadpanned loudly enough for all to hear, “Flannelman seems a mite confused.” Of course, this was met with raucous laughter and colored the rest of the discussion.

This morning, I find myself empathizing with Flannelman. As anyone who isn’t living under a rock knows, the sitting President of the United States of America, Barack Obama, was narrowly elected to a second term on Tuesday. (I say “narrowly” because he received a mere 50% of the vote, with narrow electoral wins in quite a few states.) Many of us, the 48% of Americans who supported his main challenger, Governor Mitt Romney, were crushed by this reality. Yet now, as the Monday-morning quarterbacks discuss what went wrong, I am left a mite confused, even a bit apathetic, unsure of just where I fit in.

Mitt Romney, of course, is a Republican, the antithesis of Barack Obama’s Democrat. This morning, I read a piece from an ultraconservative web site, Tea Party Nation, which maintains that Romney lost not because the nation is too Liberal, but because Romney is. The author, one Judson Phillips, claims that Romney lost because he is a moderate, just like John McCain in 2008, George W. Bush in 2000 (who lost the popular vote), Bob Dole in 1996, and George H. W. Bush in 1992. (Phillips maintains that the earlier Bush was only elected because he was Reagan’s Vice President, and couldn’t cut it, on his own.) Phillips does not deign to explain why Tea Party–sponsored candidates, most notably Indiana senatorial candidate Richard Mourdock, easily won the Republican Primary but crashed and burned in the actual election, thus giving the Tea Party full credit for the Democrats’ increased control of the Senate.

All this comes back to one thing, for me: a sense of futility that I think is shared by many, many Americans. Like Romney (and decidedly unlike Obama), I am a moderate. I believe there is good in both the Republican and the Democratic Parties, and I have never clicked the “straight party” button while in the proverbial voting booth. I tend to think this moderation is a sign of intelligence, a sign of understanding that few things in this world are pure black or white. Just as Flannelman—who, as far as I know, remains your average American, heterosexual male—could look at his hand by extending his fingers, so, I believe, most Americans need not be confined by extremist political ideologies.

Are moderates the silent majority? Are we being dragged, kicking and screaming, by two outspoken extremes? And most importantly, is there any way around it, in a system that requires a candidate to receive a majority vote? As did Flannelman, I feel a mite confused.

Thursday, November 1, 2012


An online friend recently alerted me to an article in Psychology Today entitled “Colorblind Ideology is a Form of Racism.” I looked it over, and the author, Monnica Williams, Ph.D., makes a fairly intelligent-sounding case for her proposition. Unfortunately, on closer inspection, the article boils down to an exercise in circular logic. As I commented on the original Facebook share:
[B]asically, the author’s argument is that we shouldn’t work toward a colorblind society because we don’t live in a colorblind society. By that argument, we shouldn’t work for world peace because we don’t live in a peaceful world; we shouldn’t work to feed the hungry because we don’t live in a world without hunger; we shouldn’t work to educate the masses because we don’t live in a world without uneducated people. In short, if we have a goal, we must abandon it immediately because we live in a world where that goal has not yet been achieved.

How does that make even the slightest amount of sense?

Not surprisingly, my criticism was quickly met with an equally well thought-out response, from one Nick Moor:
You should probably read the article closer. The author is not saying to never work towards a post-racial world, but is instead arguing that adopting a colorblind approach in all things means that one ignores how race negatively impacts large swathes of people, which at the most basic level means you'll probably be poorly equipped to actually challenge racism. If you accept that racial inequality still exists, then ignoring how race affects outcomes and experiences is a poor response to said inequality. The author provides the alternative of multiculturalism.

As so often happens in these discussions, my response is really too long for (and would ultimately be too limited by) a simple Facebook thread. As such, Nick, I hope you’ll indulge me as I respond to you here.

First of all, Nick, I appreciate the clarification. (That’s an honest thank you, by the way, not a snide remark. The written word is so hard, sometimes!) I actually understand the argument you’ve highlighted, but I really don’t think it applies as much as you do, for one reason:

Colorblindness and multiculturalism are not mutually exclusive.

Let’s take my own family, for example. I grew up in the suburbs of New York City, in a firmly middle-class family. My ancestry includes Dutch, Irish, English, and Hungarian (and many more, if you continue back far enough)—a pretty multicultural background, yet one that never breaks out of a single, overarching “race”: Caucasian. So far, multiculturalism has absolutely no bearing on racism nor colorblindness, as neither colloquial race nor colorblindness is involved.

So, let’s move on to my wife. She grew up in an impoverished family of seven children, occasioned by her father’s degree program being terminated halfway through his college career. She was born in West Virginia, then moved to rural Indiana when her father got what would turn out to be a short-lived job. Her ancestry is Scottish, Dutch, Irish, and Cherokee—which obviously overlaps a bit with my own ancestry, but also includes a couple more cultures and even an additional “race” (Native American). Sadly, the Cherokee is so muted by the Scottish that you’d never notice it unless you’re particularly familiar with the tribe and its unique, telltale features. Thus, she remains effectively Caucasian. Again, highly multicultural, but the diminution of her Native-American heritage renders colorblindness fairly moot.

Now, how about our children? Two of our three were adopted, so they don’t share the same biological ancestry as we. As such, we actually know very little about their biological heritage. Yet, we still love and honor the families that gave them life. (Our older daughter’s birthmom is actually coming to visit us, today, and I’m currently chatting with her birthfather’s younger sister.) In short, we embrace their birth families as part of our family, an indelible part of our daughters’ respective stories and heritages. This brings in another layer of multiculturalism. However, as all four birthparents that picked us happened to be Caucasian, the colorblindness/race card still hasn’t even been played.

So, let’s spread out a little farther. Four of my wife’s siblings are married. Two of them married other Caucasians, but two did not. Her brother’s wife happens to be three-quarters Mexican and one-quarter Spanish. In contrast to my wife’s and my fairly suburban upbringing, our sister-in-law grew up in the city and spoke American Spanish in the home—a language she has passed on to both my brother-in-law and their children. Obviously, in this case, both race and culture are significantly different, so we’re finally getting somewhere in terms of our discussion.

My wife’s sister also married a non-Caucasian man. Our brother-in-law is a mix of African nationalities and grew up in poverty, in the deep South. He got into college on a football scholarship and was drafted into the NFL right out of college, but an injury preceded his first season and he wound up back in his college town of Lafayette, Indiana (where he met my sister-in-law). Again: different race, different upbringing, different life experience.

The point I’m trying to make is that we’re a fairly multicultural family, and we do recognize the individual contributions of each family member and his or her cultural upbringing. We love and honor each other, including in-laws and nieces and nephews just as much as anyone else. While we may be of different races, we respect and even, to a certain extent, embrace each other’s culture and their stories, despite their difference from our own. Yet at the same time, we ignore the color of each other’s respective skins, just like we ignore each other’s hair color, eye color, etc.; all are ultimately irrelevant to who we truly are.

Yes, people have different life experiences. Yes, there is a so-called “white privilege” that US society and culture affords to those of us “lucky” enough to be born with less melanin in our skin. But simply embracing a non-colorblind, multicultural attitude doesn’t solve this problem; in some cases, forgoing colorblindness actually compounds it! One only need look at the debate over Affirmative Action to see that many “white” people are a lot more aware of “black privilege” than they are of their own. This awareness leads to increased anti-“black” racism and, by extension, increases the preponderance of attitudes that led to “white privilege” in the first place. While the long-term benefits of so-called “reverse discrimination” may be desirable, the short-term drawbacks often undermine its effectiveness. In short, ignoring race often does much more good than attempting to compensate for it.

Dr. Williams cites Dr. Martin Luther King, Jr.’s famous “dream that [his] four little children not be judged by the color of their skin, but by the content of their character,” a dream that I suspect to be shared by everyone involved in this discussion. The problem is that while multiculturalism does help us understand one another, multiculturalism, by itself, does nothing to accomplish this goal. It is only by pairing that multiculturalism with the very colorblindness Dr. Williams eschews, that we can truly create lasting change in our society’s racial relations.

Wednesday, October 24, 2012

The Electoral College

Today’s question comes from a high-school friend of mine, Erin Solej. She asks:
“Jeff, I remember you commenting on the necessity of the electoral college, but I want to understand it better because I don't see how my vote counts with this system. Of course, I will vote, but can you please post your thoughts on this.”

This is seriously a great question, Erin. I hear about this all the time—people don’t understand the Electoral College, and since people tend to fear what they don’t understand, a lot of them want to do away with it. On this point, though, I must respectfully disagree. Knowledge brings familiarity, and frankly, the Electoral College isn’t nearly as bad as some of the more outspoken among us make it out to be. So Erin, thank you so much for your question. Per your request, here are my thoughts:

Back when our nation was still in diapers and the Constitution was still being written, there arose a big controversy between the states regarding representation. Some states—generally the more populous ones, of course—argued that representation should be based on population; others—generally the less populous ones—argued that a group of united states should each get an equal vote. From this argument came “the Great Compromise,” which resulted in our two houses of Congress: the Senate, with its two congresspersons per state; and the House of Representatives, with its one congressperson per Congressional district.

The point of the Electoral College is to apply that same Great Compromise, struck for the Legislative Branch of the federal government, to the Executive Branch. Thus, Article II, §1 of the United States Constitution reads:
“Each State shall appoint, in such Manner as the Legislature thereof may direct, a Number of Electors, equal to the whole Number of Senators and Representatives to which the State may be entitled in the Congress: but no Senator or Representative, or Person holding an Office of Trust or Profit under the United States, shall be appointed an Elector.”

[Map: Electoral College, 1789]
So, instead of having a strictly popular vote, the Electoral College consists of two electors per state and one elector per district—identical to the U.S. Congress. As with the original Great Compromise, the states with smaller populations maintain their voice through their allotted two electors, while states with larger populations maintain their larger voice through the other 10, 20, 50, what have you. Since 1961, this method has also included the District of Columbia, which, per the 23rd Amendment to the Constitution, receives “A number of electors of President and Vice President equal to the whole number of Senators and Representatives in Congress to which the District would be entitled if it were a State, but in no event more than the least populous State.” (As of this writing, this number is the same as the number of licks it takes to get to the Tootsie Roll center of a Tootsie pop: uh-three.)
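To make the arithmetic concrete, here’s a minimal sketch of the allocation just described. (This is my own illustration; the function names and the seat counts in the comments are mine, not anything from the Constitution.)

```python
def electoral_votes(house_seats: int) -> int:
    """Electors for a state: two 'senatorial' electors plus one
    per House district -- the Great Compromise carried over."""
    return 2 + house_seats


def dc_electoral_votes(entitled_seats: int, least_populous_seats: int) -> int:
    """D.C.'s electors per the 23rd Amendment: what it would get
    as a state, capped at the least populous state's total."""
    return min(electoral_votes(entitled_seats),
               electoral_votes(least_populous_seats))


# A state with a single at-large district gets 3 electoral votes;
# one with 53 districts (California, after the 2010 census) gets 55.
print(electoral_votes(1))       # 3
print(electoral_votes(53))      # 55
print(dc_electoral_votes(1, 1)) # 3 -- the District's current total
```

The cap in `dc_electoral_votes` is why D.C. has held steady at three electors ever since the 23rd Amendment took effect.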

Now, are there other reasons for the Electoral College? Sure. Some argue that the Electoral College was also created because the people were generally uneducated—hence, “College”—and especially because communication was so much slower in 1788 than it is today. The argument continues that because we are now more educated and able to communicate instantaneously, the Electoral College should be disbanded. Unfortunately, while these points are certainly valid, they fail to address the Great Compromise, which remains the primary reason for the College’s existence.

[Map: Electoral College, 2000]
There is also an argument that since the electors are appointed “in such Manner as the Legislature thereof may direct,” the Electoral College should be reformed per the method originally used in Massachusetts and currently used in Nebraska: the popular vote for the entire state only dictates the first two electors, while the popular vote in each Congressional district dictates the elector sent from that district. This is, of course, perfectly Constitutional, in part because it does not negate the Great Compromise. I would wholeheartedly support such a reform of the entire system, but it’s hard to see how such a change would come about. Most states are controlled by the same party as their winner-take-all status tends to support, so it’s unlikely that they’d give up those extra votes—especially with as many close elections as we’ve had, in recent years.

So, Erin, now we come down to the crux of your quandary: whether or not your vote actually counts. Sadly, in a nation this size, that’s always going to be a tough call. It’s certainly easy to be discouraged when one candidate has a huge lead in your state, knowing that whichever way you personally vote, that candidate will carry all of the state’s electoral votes. I feel the same way, although obviously the candidate with a seemingly insurmountable lead, here in Indiana, is not the candidate who holds one, in New Jersey. But I guess my question is: would it really be that much better, in a true democracy? It’s hard to say. We’d still have hundreds of millions of eligible voters, so it would still be quite easy to get lost in the shuffle—perhaps even more so. For this reason, I suspect we’d be just as discouraged.

Ultimately, the best advice I can give you is to work for a Nebraska-style reform of the Electoral College in New Jersey. It’s fair, it’s Constitutional, and it could theoretically make your vote count just a little bit more than it does now. As for this election, the best you can do is grab a sample ballot for your local voting district—they’re often available online—and study all of the candidates. Don’t content yourself with the two main parties, nor even just those who are running for President. Research the candidates at every level of government, figure out their positions on the issues—especially the ones which are most important to you—and as always, get out and vote for whichever candidates will best represent your interests.

As for the Presidential election: while your vote may not seem to count, quite as much as you’d like, at least you can content yourself with knowing that whichever candidate wins, you’ll be among those many millions immortalized as a number in the popular-vote results on Wikipedia. ;-)

Hope that helps!

Thursday, August 16, 2012

People of Walmart

So yesterday, as I walked through Walmart, I heard a guy yelling at a crying child, in the toy department. As there didn’t seem to be any physical harm occurring, I kept going, but heard part of the conversation, as best as I can recollect:

Man: “Shut up!”

(child keeps crying)

Man: “Shut up, [name]!”

(child keeps crying)


(child gets louder)

Man, slightly calmer: “Look, please shut up and listen to me.”

(child softens, but keeps crying)

Man: “Will you listen to me? Stop crying, and listen to me?”

(child stops crying)

Man: “Look, we’re here to get a toy for [other child]. [Other child] is getting a toy because she did what she was told. You did not do what you were told, right?”

(child whimpers)

Man: “When Mommy told [other child] to go sit in the corner, she did. That’s why she’s getting a toy.”

Um… parenting FAIL?

Monday, August 13, 2012

AppleScript: Set Display Brightness

[Image: Apple Thunderbolt Display]

I’ve been using AppleScript for years now, but I’ve never had the opportunity to become really good at it. I can do a few simple things, but once I get beyond that, I pretty much have to turn to the web. So it was, this morning: my office has two large windows in it, so the ambient light varies quite a bit from day to day and hour to hour. I finally decided that I’m sick and tired of adjusting the brightness on three different displays—not to mention hoping that they’re all exactly the same—on a regular basis. So as usual, I turned to the web.

After a few less useful hits, I finally came to an old blog post called “Change Monitor Brightness Using AppleScript.” It was exactly what I needed, with three exceptions:
  1. Since it’s four years old, it hasn’t been updated for Mountain Lion.
  2. It isn’t designed to change multiple displays concurrently.
  3. It doesn’t include the ability to specify the brightness level, on the fly.
Having solved all three of these problems, I post my results, in hopes that it may help someone else:

set brightness_level to (text returned of (display dialog "Set Brightness Level" default answer ".875" buttons {"Cancel", "OK"} default button "OK")) as number

tell application "System Preferences"
	-- Open the Displays preference pane. (The pane ID was blank in
	-- my original draft; "com.apple.preference.displays" is the
	-- standard ID for that pane.)
	set current pane to pane "com.apple.preference.displays"
	tell application "System Events"
		-- One Displays window opens per attached display, so walk
		-- them all and set each brightness slider to the same value.
		set j to (count windows of process "System Preferences")
		repeat with i from 1 to j
			set value of slider 1 of group 1 of tab group 1 of window i of process "System Preferences" to brightness_level
		end repeat
	end tell
end tell
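One small caveat: the dialog coerces whatever you type straight to a number, so a stray value like 5 (or −1) would be handed to the sliders as-is. If that bothers you, a guard along these lines—my addition, and untested—could clamp the input to the slider’s expected 0–1 range before the System Preferences block runs:

```applescript
-- Hypothetical guard (not part of the script above): keep the
-- entered brightness within the 0-1 range the slider expects.
if brightness_level > 1 then
	set brightness_level to 1
else if brightness_level < 0 then
	set brightness_level to 0
end if
```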

Enjoy! :-)

Tuesday, August 7, 2012

Some Things Should Just Stay Dead

One of the weirdest things about Netflix is how my children are able to dredge up old TV shows that are off the air for a reason. Case in point: Hanna-Barbera’s Godzilla. For those who are mercifully unaware of this show, you can check the Wikipedia page, but I’ll give you my summary of the first episode.

The show seems to revolve around the crew of a small ship, consisting of the following:
  • a white man, who is, of course, in charge. If his white maleness weren’t enough to tip you off to this fact, his name is—seriously—Captain Majors.
  • a white woman named Quinn, which, given the next character, should be particularly humorous to fans of Sealab 2021.
  • a black man whose name I didn’t catch, but who is obviously intelligent because he wears glasses.
  • a white boy, probably about ten years old, with roughly the same haircut as Velma Dinkley.
  • a small, flying, green dinosaur named Godzooky, who seems to serve the same purpose as Scrappy-Doo, i.e. comic relief peppered with occasional furtherance of plot.
At the beginning of the show, a volcano erupts and a big bird sticks his head out. (Notably absent is the presence of any snuffleupagi.) The bird surreptitiously disappears, but the volcanic eruption apparently triggers a tsunami, which threatens our heroes on the boat. A frightened Captain Majors somehow manages to contact Godzilla, who shows up just in time to pick up the boat and hold it above the tsunami because, you know, that’s just the kind of thing Godzilla does, right?

Since our heroes apparently have the combined IQ of a bologna sandwich, they decide to head for the volcano that’s still erupting. There they find a couple of scientists who were stranded there and decide to investigate further by—hey, why not?—going directly into the erupting volcano, via a convenient, perfectly round hole in the side.

Now, having personally been in a simulated house fire—in full firefighting gear, no less—I can say from experience that that is hot. I expect that an active volcano would be even hotter, yet somehow our heroes are able to make their way right up to the side of the pit, wearing no protective gear whatsoever, and while they do comment on the heat, they don’t even break a sweat—a feat Krillin is apparently incapable of mastering, even on the mildest of days.

What they do find, however, is the firebird (or whatever), happily basking in the center of the volcano. Suffice to say, the firebird does eventually leave the volcano and battle Godzilla, who is apparently like the Captain Planet of this series: eponymous, yet only showing up as needed—again, at the request of Captain Majors, who is apparently some sort of long-distance lizard whisperer.

The point is that some shows—and this does apply to shows other than Hanna-Barbera cartoons, although I can’t think of any at the moment—are just really, really bad. The plot is bad, the dialog is bad, the animation (yes, they’re usually animated) is bad; they’re just plain bad, and not in a Michael Jackson–choke-the-chicken sort of way. These shows have gone on to a peaceful death as the world has recognized their badness, and yet thanks to Netflix, they now have new life, available on demand for all who choose to resurrect their glorious ineptitude.

Thanks, Netflix. The world needs more zombie cartoon shows.

Monday, July 30, 2012

Worst… Rip… EV4R

Back when I was a kid, growing up in New Jersey, there were only two radio stations on any of our radar: Z-100 (WHTZ / 100.3 FM) and Power 95 (WPLJ / 95.5 FM). Between the two, Z-100 was by far the better station (IMHO). Part of that was the fact that they gave away something in the realm of $50,000/week in prizes, including some $30,000 in cash. But the other part of it was their morning show, the original Z Morning Zoo. I understand the Zoo is still running, but it lost something with the departure of its lead deejay, Scott Shannon, whom the network transferred to California to start up Pirate Radio (also at 100.3 FM). Sadly, Scott didn’t gel with the west-coast crowd and, after only a year or two, was back in New York. Happily for him, when Z-100 wouldn’t give him his old job back, he went to Power 95, made their morning show #1, and has been doing it, ever since.

But I digress.

The point of this post is not to give a history of New York City–area pop/rock radio. The point is that, at the end of each year, Z-100 would release a cassette, available at Sam Goody, featuring highlights from the last year of the Z Morning Zoo. And each year, my parents would buy it for me, which is why I now have volumes 2–8 of the series, spanning 1986–1992.

But something happened, during those years, something that would change the face of music forever: the rise of the CD. For Christmas 1988, I received my first CD player. As such, when Christmas 1989 came around, my Z Morning Zoo “tape” was actually on CD. I also have the 1991 and 1992 installments on CD, but sadly, there has always been one missing: in 1990, my parents were unable to find the CD—it was apparently sold out—and got me the cassette, instead. (And yes, I did appreciate it. They explained the situation, and having the cassette was certainly better than nothing.)

Of course, 1990 was not only eventful in that my parents couldn’t find the Z Morning Zoo CD for me. Much more important is that 1990 is the year I met and began dating the woman I would eventually marry, my beloved Anna. And as we rang in 1991, I started making mix tapes for her, one of which would actually include some tracks from that fateful cassette. Fast forward a decade or so, and those mix tapes were old and in great disrepair, so I did what any early twenty-first-century man would do: I reconstructed them as playlists, in iTunes. But what of those tracks from my old, beat-up cassettes? Unfortunately, there was nothing to do but leave a few gaping holes in the reconstructions.

Fast forward to July 2012. I was going through the old playlists once again, purchasing several tracks from iTunes, to fill in these gaps. But of course, the Z-100 Morning Zoo album from 1990 is not on iTunes—big surprise. So, I checked online, and lo and behold, there it was, used, on Amazon. I grabbed it, and it arrived in the mail, last week.

After several days of sitting on my desk, I finally got around to sticking it in the computer to rip, this morning. Since iTunes starts ripping automatically, I could hear the disc spinning up and being converted to 320kbps MP4s. The completion sound chimed, all was well, the disc was ejected, and I went to review the fruits of my labors.

There were two tracks.

It was not a mistake.

When the original CD left the factory, Z-100 apparently felt it should be burned not as the dozens of separate tracks included on the disc, nor even the mere fourteen tracks listed on the face of the disc and in the liner notes.

It was burned as “Side One” and “Side Two”.

And so the division begins….

Wednesday, July 25, 2012


This morning, I got a call that Caller ID brought up as “MCGRAW HILL COM.” The woman was barely intelligible, but began a conversation that went something like this:

“Hello, may I speak to [unintelligible] Drake?”

“Yes, this is Mr. Drake.”



“Is this Mr. Drake?”

“Yes, this is Mr. Drake.”

“Oh, okay. Hello, Mr. Drake. I am calling to talk to you about your computer, yes?”


“Yes, well, I am calling to talk to you about your computer, because as you know, there are lots of threats on the Internet, yes?”


“So what I want is to help you with your computer, yes?”


“So now the first thing you are going to need to do is go to your computer, yes?”

“Sure. I’m there.”

“You are at your computer?”


“Okay, Mr. Drake, and now I need you to turn your computer on.”

“It’s on.”

“It’s on?”

“Yes, it’s on.”

“Okay. Now, what I need you to do is look at your keyboard.”


“Yes, now I need you to look at your keyboard, at the lower left, and tell me what key is there.”




“Okay. And what key is next to that?”


“Excuse me?”


“Ah, yes. Do you have an AppleMac?”


“Ah, yes. Thank you very much.” [click]

Gotta love it.

Saturday, June 30, 2012


I’ve noticed, over the years, that the companies that make my wife’s make-up seem to enjoy repeatedly changing the names of their colors, with no indication of why they have done so, much less what the newly renamed colors used to be called. (In fact, I only assume they even have an identical predecessor; I really have no idea.) I suspect the reason for this is that, with the exception of certain brands (e.g. Sei Bella), once an item of make-up has been opened, it can’t be returned nor exchanged. By constantly changing the names and colors, they force women to buy dozens more items than they otherwise would, in an oft-vain attempt to continue to look the way they like.

Men, on the other hand, have it a bit easier. In our society, it’s pretty much accepted that men—or, at least, the great majority of men—don’t wear make-up. As such, we never have to worry whether last week’s “Smokey Ember” is this week’s “Light Obsidian.” But we do have a handful of hygiene products that we do use, which brings me to today’s topic: deodorant.

Perhaps 15 years ago, I discovered Speed Stick’s 24/7 deodorant gel. I opened each of the three or four different scents and was happy with the one whose name was on a blue background. I bought it and have never looked back. Over the years, I’ve continued to buy it—or have I? To be honest, I have no idea. I don’t know what it was called then; I don’t know what it’s called now. I don’t even know if it’s the same scent. For all I know, they’ve spent $25 million on 15 new scents, trying to give it more widespread appeal, and I never even noticed. I just buy the blue one. It’s probably called something like “Cool Rush,” but if they changed the name to “Fresh Fish” or even “All-Natural B.O.,” my eyes would just gloss over the new text and I’d buy it anyway because hey, it’s the blue one.

Am I alone in this? Does anyone else (especially men) notice changes like this, or even care?

Friday, April 13, 2012

Another Tale of a Twelfth-Grade Nothing

Yesterday I spoke about those embarrassing moments from teenage life. Here’s another, even stranger incident from my oft–ill-fated senior year:

Growing up in the metaphorical shadow of the Twin Towers had its advantages. Case in point: each year, two nights before graduation, our high school would rent out an entire ferry from the Circle Line, a sightseeing company that cruises around Manhattan Island several times a day. The Senior Cruise was—perhaps is—a dinner/dance, one last treat for the graduates before they don their caps and gowns on Friday.

Another nice thing about our school was exam week: exams began the Wednesday before graduation week and continued through the following Wednesday (the same day as the cruise), with Thursday (the day before graduation) reserved for make-up exams. The tests ran from 8:00–10:00 and 10:30–12:30 each day, but were scheduled such that very, very few people ever had more than one per day. Students from all four grades came in for any slots in which they had an exam, and were dismissed when it was done. Thus, we were all left with the great majority of each of these days off.

This brings us to Wednesday afternoon, after the last exams of our high school career were behind us. I was at my best friend Keith’s house, where we were discussing that evening’s cruise. In a stunning display of ignorance, Keith and I decided to make a mix tape that we could bring along, just in case the professional DJ wasn’t doing a good enough job. (Really.) We didn’t have any blank tapes, though, so we just recorded over an old one—which would have been fine, except we didn’t have enough time to fill the entire tape and—more importantly—somehow forgot that the previous contents would still be there.

Well, big surprise: that night, the DJ was doing a really good job. Nevertheless, Keith and I decided to request a song for the class: Bon Jovi’s Never Say Goodbye. (Never mind that the DJ almost certainly had that disc, and just as certainly would have played that high-school–graduation anthem.) Keith approached the DJ, cassette in hand, and asked him to play the last song. The DJ took the tape, rewound to the beginning of the last track, and, listening to it with headphones attached to a separate player, asked Keith who it was. “Bon Jovi,” came the response, complete with the eye rolling such a question deserved. The DJ shrugged and put it in the queue; a couple of songs later, we heard the strains of… well, not Bon Jovi. It was, in fact, a demo tape I had made of a song I wrote for my then-girlfriend (now wife), Anna. There was nothing to be done; the song, which no one besides her was ever supposed to hear, was being broadcast for the entire senior class. And, amazingly enough, people were dancing.

I walked around the room in a daze. No one was laughing; no one was holding their ears; in fact, they were all acting like my ridiculously unprofessional track was something they’d heard on the radio, just last week. (I blame the distortion of dance-volume speakers.) As I came to the back of the room, I passed a girl I’d had a crush on for years (which is a bunch of embarrassing stories, in itself). As she sat there on a bench, with some friends, I heard her say, “I really like this song, but I’ve never heard it before! Who is this?”

Sheepishly, I responded to her query: “Actually, it’s me.” I’ll never forget the look of awe on her face.

As I continued my lap around the room, I was walking on air. The girl I’d once pined for now saw me in a new light. The rest of the class thought my demo was a professional song, and would never know the trick I’d inadvertently pulled on them. And then, as the final chords faded, I heard Keith’s voice. He had grabbed the microphone from the DJ and announced, “For those of you who were wondering who that was, it’s Jeff Drake!”

And suddenly, all eyes were on me.

I don’t remember what happened after that; I do know the rest of the night was rather uneventful, except for the part where I was the sole witness to our salutatorian walking smack into the wrong half of a half-open sliding glass door. “You saw nothing,” she warned me. I just smiled and nodded.

I guess other people have those moments, too.

Thursday, April 12, 2012

Tales of a Twelfth-Grade Nothing

I just turned 37, this week, which may or may not have evoked some of the reflections I’ve been having. It’s more likely that it involves my upcoming 20th high school reunion, which I won’t be attending due to scheduling conflicts, just like my 10th. (I actually am interested in going, but ironically enough, we’ll be in my home town for my wife’s high school reunion, then in her home town for my 20th. Nice, huh?)

Anyway, the title of this post should serve to set the stage for what I’m about to write. I was very much a nerd, throughout my school years—still am, to some extent, though that matters much less to a 37-year-old. By high school, I tried to tell myself that I didn’t care what other people thought of me, and occasionally went to great lengths to prove it. But somewhere in the deepest, darkest recesses of my mind, unbeknownst even to me (but probably quite obvious to everyone else), I did care. Even as a senior, when I had finally gained a little cred, I still longed to be included—which led to some interesting events.

I remember our year-end jazz band concert. I played piano and keyboards, but was the only member of the rhythm section (also consisting of a guitarist, bassist, and drummer) who wasn’t also a member of our school’s most popular garage band. I don’t know if they were asked, or if they volunteered, to perform some prelude music before our set, but the point is that they did. And I, as the sole member of the rhythm section who wasn’t a member of their band, decided to join them on stage. I sat behind the piano where no one could see what I was doing—no one but my bandmates, of course—and moved my hands around the keyboard, pretending to play with them. I guess I figured that since the piano is a fairly soft instrument, the audience wouldn’t wonder why they couldn’t hear me over the electric guitar, bass, and drums. I wanted to be cool; instead, some 20 years later, I’m still absolutely amazed by how incredibly lame I was.

Strangely enough, another memory of sorts goes back to that same set of concerts. The next day, we did an abbreviated set for a school assembly. My electric keyboard didn’t have a pedal, but I somehow convinced my MIDI consultant to lend me his. Unfortunately, I didn’t understand how it worked—that it was basically a glorified Boolean toggle switch—and managed to hook it up to my keyboard, backwards. Instead of just disconnecting it and playing as I had always played, I kept it plugged in and tried to pedal backwards. Needless to say, it didn’t go well; I wound up completely messing up a solo, in front of the entire student body.
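Incidentally, that “glorified Boolean toggle switch” description is quite literal, and a few lines of code make the backwards-pedal problem obvious. This is just an illustrative sketch (the function name is invented, not part of any real MIDI library), though the underlying convention is standard MIDI: a sustain pedal sends continuous controller 64, with high values meaning “sustain on” and low values meaning “sustain off.” Reversed polarity simply inverts every reading:

```python
def sustain_cc64(pedal_pressed: bool, reversed_polarity: bool = False) -> int:
    """Value a keyboard sends on MIDI CC 64 (sustain) for a pedal state.

    Hypothetical helper, for illustration only. A sustain pedal is just a
    switch; if the pedal and the jack disagree on polarity (normally-open
    vs. normally-closed), every reading comes out inverted.
    """
    on = pedal_pressed != reversed_polarity  # XOR: polarity flips the switch
    return 127 if on else 0  # convention: 127 = sustain on, 0 = sustain off

# Correct hookup: pressing the pedal turns sustain on.
assert sustain_cc64(pedal_pressed=True) == 127
assert sustain_cc64(pedal_pressed=False) == 0

# Backwards hookup: sustain is on only while the pedal is *up*, so the
# player has to "pedal backwards" to get any notes to hold.
assert sustain_cc64(pedal_pressed=True, reversed_polarity=True) == 0
assert sustain_cc64(pedal_pressed=False, reversed_polarity=True) == 127
```

Many modern keyboards auto-detect the pedal’s polarity at power-on, or expose a polarity setting, which would have sidestepped the whole problem.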

I remember the day of the AP chemistry exam. It was held at my parents’ church, for some reason, and the rule was that after the exam, students were supposed to return to the school immediately. The unwritten rule, however, was that everyone would go get some lunch and maybe come back for the end of the school day. Of course, I wanted to do this, but despite having gotten an old car for my 17th birthday (a fairly standard practice, in my town), I didn’t yet have a driver’s license, just a permit. (Long story, that.) But instead of going back, I found a guy who had a license, but no car, and got him to be my licensed-driver chaperone. I barely knew the guy and we actually didn’t get along very well, but it was mutually beneficial, so we went for it. We went to some random restaurant outside of town and then spent an hour or two at the mall. I have no recollection of what, exactly, we did, but how weird was it to be doing this with a tenuous acquaintance? Only my closest inner circle ever knew about that day.

The list could go on and on, but having been part of the “out” crowd throughout high school, I’d be interested to know how common this is. Do most people replay these kinds of stupid, lame decisions, even decades later? Just curious.

Tuesday, April 3, 2012

Hungry for Less

Before I begin, I’ll be the first to admit that I was expecting a lot, going into this. With all the hype surrounding the then-upcoming The Hunger Games movie and several members of my extended family gushing about how great the books are, I figured it would be amazing. Unfortunately, it just… wasn’t.

The Hunger Games is split into three parts: The Tributes, The Games, and The Victor. (Note: while there are also three novels in the series, I am speaking only of the first novel, which is itself divided into the aforementioned parts.) What’s interesting is that while many books increase in intensity from start to finish, this one is more like a bell curve. Part I (comprising chapters 1–9) begins a bit slowly, as it necessarily lays the groundwork for the story ahead. This, of course, is to be expected, and by the end of Part I, I had come to know and love the main protagonist. As Part II (chapters 10–18) begins, I was finally fully engaged in the story, and I loved every minute of it, constantly clamoring for more.

But then I got to Part III.

I’m not sure what was going through Suzanne Collins’ mind as she wrote the third and concluding part of this novel, but it is, in a word, boring. It is so bad that there were several times I was about to close the book for good and just not waste my time on seeing how it ended. Still, I persevered, confident that since so many people think it’s so great, there had to be something worthwhile at the end, something that would make this tremendous monotony worth it. The problem is: there isn’t.

After slogging through almost 100 pages of boredom, I finally arrived at the climax of the novel, only to be presented with a ridiculous, contrived, final battle that wasn’t even exciting while it happened. And then, once it was finally over and I got to enjoy the results, there were still another 30 pages of slogging boredom before I finally read “END OF BOOK ONE”—as if I’d actually subject myself to two more novels of this.

I realize, of course, that I’m not in the “young adult” target audience, but I enjoy several other novels of that genre. “Young adult” doesn’t necessarily mean “catering to tweenage pop culture,” but the climax was definitely just that.

I’d also like to point out that one of my three stars is there because The Hunger Games is completely and utterly devoid of adult language and sexual situations, a feature I wish were found in more modern literature. Even one of my favorite books of all time, Ender’s Game (the movie for which comes out in March 2013), doesn’t have that feature.

So, all in all, a book that sadly failed to impress, but one I might read again, someday. I’d still love to know what other people find so compelling, and maybe I’ll find it on the second time through.

Wednesday, March 7, 2012

Double Standard

First of all, let me be perfectly clear. Rush Limbaugh is an idiot. I don’t like the guy, I never have, and the fact that he recently made some sexist statements doesn’t endear his cause to me any further. I’m also not a fan of Sarah Palin, whose name will also appear in this post.

Now, that being said, I just received an email that makes a very good point: what’s good for the goose is good for the gander. I honestly don’t have time to run down every one of these statements, but every one I checked is indeed legit. If you happen to find any that aren’t, please let me know. So without any further ado…

“Rush Limbaugh's words have given the… left the opening [it] needed, and they have pounced. Rush has apologized. But the radical left will never accept it because they despise him and want him off the air. To the left, this is simply an opportunity to put their attacks on religious liberty in a feminist frame, and an opportunity to try and shut down Limbaugh and end his career. It is all about censorship and hypocrisy.

“The hypocrisy from liberal leaders and the leftist media is astoundingly shocking, intellectually dishonest, and is utterly insulting on so many levels. Below, we will show you just how all of this is true. Their leftist cronies have said far worse, have not been reprimanded, and offer no apologies. Fair warning, there will be graphic language directly quoted.

[plea for financial support to the organization sending the email]

“Just how sensitive are liberals to the plight of women, anyway? Let’s see how they react when one of their own savages women in ways Limbaugh would never dream of doing.

“Last night Robert F. Kennedy Jr., publicly on his Twitter account, called United States Senator James Inhofe a ‘prostitute’ and ‘call girl.’

“MSNBC host Ed Schultz called conservative talk host Laura Ingraham a ‘slut.’ Some of the same sponsors who are now pulling out of Rush’s show still support Ed Schultz’s show.

“Bill Maher one year ago called Sarah Palin a ‘dumb twat.’ He followed up days later in a Dallas routine and called Palin a ‘c*nt.’ Last July on HBO, Maher said Palin was ‘a bully who sells patriotism like a pimp, and the leader of a strange family of inbred weirdos.’ Last September, Maher said on his show that Palin would have sex with Rick Perry if he [were] black. He also joked recently that Rick Santorum’s wife, Karen Santorum, uses a vibrator in the bathroom while her husband is in the other room. Did we mention that Barack Obama’s Super PAC accepted $1 million from Bill Maher?

“Bill Maher gloated on his show that critics can’t touch him because ‘I don’t have sponsors.’ It’s not about sponsors. It is about principles, and the liberals are absolute hypocrites to not show the utmost disgust for this type of intolerable language. They are essentially saying it is ok if the woman is a conservative.

“Talk show host Mike Malloy hoped Sarah Palin ‘drives herself into madness’ and insisted Michele Bachmann is an ‘evil bitch from Hell’ who would have gladly supervised the Holocaust. Montel Williams rooted for Bachmann to slit her own wrist or throat. Randi Rhodes insisted that teenage boys weren’t safe from Palin’s advances if they stayed over at her house.

“A couple of years ago, comedian Louis CK ‘joked’ on the Opie and Anthony radio show about Palin coming to the Republican convention ‘holding a baby that just came out of her f***ing, disgusting c*nt, her f-ing retard-making c*nt. I hate her more than anybody.’

“On Twitter, Louis CK attacked Palin in 2011 as a ‘f***ing jackoff c*nt-face jazzy wondergirl’ who ‘has a family of Chinese poor people living in her c*nt hole.’

“Has Louis CK been chastised by his liberal cronies? No, quite the opposite. He will be headlining The Radio and Television Correspondents Dinner in Washington, DC.

“‘We’re very excited about having Louis CK at the dinner,’ said Jay McMichael of CNN, who chairs the Radio and Television Correspondents Association’s executive committee. ‘This is an evening you’ll want to experience. We’re shaking things up, showcasing the unexpected, and delivering lots of laughs.’

“For too long, liberals have been able to get away with this behavior. That ends now. Conservatives, it is time to stand up to this hypocrisy, and you can help us lead the charge.”

Point is, they’re right. Rush may be a jerk, but holding him any more accountable than these other people is just plain ridiculous. Let’s make sure every one of these people is held accountable for their “humor.” Bad taste is bad taste, regardless of the source.

Wednesday, February 22, 2012

Familiarity Breeds Contempt

I remember when I was a kid, a new pizza place opened up in our area. It was about half an hour away, but we’d heard it was really good, so we made the trek to try it. It lived up to the hype: their delectable pan pizza was completely different from the same old genuine New York–style pies at all nine Italian- or Italian-American–owned pizza/sub shops in our tiny New Jersey hamlet of 15,000 people. It was also expensive, but it was so delicious that we’d drive up Route 46, every few months, to take advantage of that wonderful place called “Pizza Hut.”

A few years later, they opened another store—their second, in my young mind—about 10 minutes from our house. Now it was easier to get there, though our weekly pizza night continued to come from locals like Cosmo Bella or Bachagaloop’s. Still, when the craving came, we’d still head out to Totowa to get a Pizza Hut pizza and a side of cheesy garlic bread. Definitely great stuff.

On the other side of the coin were burgers. Sure, we could always go to our local McDonald’s or Burger King; we could even get some really good char-broiled burgers from a local favorite like the Anthony of Wayne or the Hearth. But every once in a while, we’d go to this tiny, beat-up place down on the Clifton/Paterson border, a place so small that they just had two doors connected by an aisle where you could place your order. Still, it was always insanely busy. I don’t remember the line ever fitting inside; it was always out the door and across the parking lot, and Dad would wait in line for 30-45 minutes to get our food while Mom waited in the car with two increasingly bored kids. (Thank goodness for Walkmans.) Despite all this, we’d go back time after time for the delectable taste of White Castle.

The point I’m trying to make is that, as a kid, these now–extremely commonplace restaurants were incredibly special to me, but these days, it’s just not the same. I’m sure some of that comes from the romanticism of childhood, but some of it is the fact that it’s just not special anymore. Ironically, Pizza Hut is still 10 minutes away and White Castle is still 20 (although admittedly, the latter is available from Walmart, a mere three minutes away). But they’re still so incredibly normal that we don’t often take advantage of them.

Last night, for example, my daughter Leah’s school had a fundraiser at the local Pizza Hut. We went—even took the missionaries, since we were scheduled to feed them, anyway—but it just wasn’t the same as when I was a kid. It was still enjoyable; just not special. I attribute this to the number of times Anna and I had Pizza Hut when we were first married and it was convenient food after a long day at work. (We did the same with White Castle, which was much closer, then.) Even though that was 14 years ago, the magic is permanently gone.

So what do you think? Does familiarity really breed contempt, or at least apathy?