Americans are very keen on coining new terms when old ones become outdated, or somehow don't fit the bill anymore.
Especially in the media nowadays, we try to follow an extremely politically correct decorum, which sometimes goes beyond the call of duty. When “Hispanic” stopped working, we created “Latino.” Soon will come the day when we'll have “Southern Latino,” “Northern Latino” and “Classic Latino,” to define citizens of South American, Central American and Latin European countries, respectively.
Terms that could otherwise be inoffensive become taboo, and the English language becomes more and more limited.
Or not.
At the same time we exclude some words from our daily media vocabulary, we automatically create new ones to replace them (after all, we still need words to describe persons and events, no matter the medium). Take, for example, the word “black.”
“Black” is, first of all, a color. Before it was used to describe skin color, it was nothing more than a hue among hues, a very common one, by the way. In design, for example, black can be seen as the addition of all colors, while white simply means having no color at all.
And yet there came the politically motivated use of the term. When “black” somehow became derogatory as a way to define a group of people, we coined the term “African American.”
It's as if saying the name Barack Hussein Obama automatically made a person a lesser political contender than Hillary Diane Rodham Clinton. I'd hope we're past that point, and the latest polls, primaries and caucuses seem to show that we are indeed.
But, while the term “African American” is still in vogue, I'd like to propose a new term in the English vocabulary: “Female American.” Every time one thinks of mentioning or writing the term “woman” when referring to a segment of the population, the term “Female American” should be used instead. Especially when talking about current politics.
In a country where being patriotic has long been a staple of any candidate running for any level of public, elected office, being called simply a woman may count against a person. When deciding between an African-American (capital letters included) and a woman, the former may have an unfair advantage in being seen as more American than the latter.
If we can't compare “women” to “blacks,” maybe we shouldn't compare “African Americans” to “women.”
It's not as if women haven't suffered their own share. Women were only allowed to vote in the U.S. less than 90 years ago, with the Nineteenth Amendment; the Equal Pay Act, which ordered men and women to be paid equally, dates back only to 1963; and Roe v. Wade – the still debated abortion rights case – was decided ten years after that, in 1973.
During World War II, women were asked to work in factories to replace their soldier husbands, only to lose their jobs once the men returned. Nowadays, women are allowed to fight side by side with African American and Caucasian men in Iraq and Afghanistan, making them as patriotic as anyone else. Who's to say, then, that they don't deserve to be called “Female Americans?”
Since the U.S. is one of the few countries that prides itself on creating such divisions in society, where people are African Americans, Asian Americans, Native Americans and so on, instead of “just” Americans, it seems only fair that we start calling this minority (which actually represents the majority of the population, with a longer life expectancy, too) by its proper name of “Female American.”
This way, our battle for the democratic nomination would be between an African American and a Female American. It also gives us time to come up with a better term than “Caucasian” for John McCain.
Wednesday, March 5, 2008
Why three (or four or five or...) is better than two
While the current primaries may just be the most exciting of modern times, they also make it clear that the U.S., despite what many political scientists claim, is indeed a two-party, two-candidate country. And it will be this year more than ever.
It is true: since 1854, when the Republican Party was officially created, Democrats and Republicans have always faced some sort of competition in presidential elections. Under the flags of Whigs, Greenbacks, Prohibitionists, Socialists and others, many have tried to fight the two-party system in this country. And yet, they've all failed.
In the end, the way the political system is currently structured, a candidate who doesn't belong to one of the country's two major parties has virtually as much chance of becoming president as Dennis Kucinich ever had, before or after dropping out of the race.
First, due to the political power Democrats and Republicans hold. Allies, as Obama and Clinton have shown, are extremely important, especially in places where voters may be in doubt or evenly divided. By having a famous political figure by his or her side, a candidate can convince a whole population to follow suit. Aside from the semi-celebrity of Ralph Nader, a third-party candidate would have almost no chance of finding a famous political ally, save for one or two traitors to the Reds or Blues.
Second, due to pure and simple economics. Unlike Romney and Bloomberg, most candidates can't even dream of funding a campaign mostly with their own money. And an independent candidate wouldn't even dream of amassing the contributions that candidates from the two main parties do.
The third point relates directly to the primaries, and is especially highlighted by the current ones: due to their nomination systems, Democrats and Republicans get extremely unfair publicity compared to other candidates. While media outlets tend to balance their coverage between Democrats and Republicans, they're at the same time elevating both to a status that will never be achieved by others.
It's free advertising at its maximum: turn on the TV at night, and you will no doubt hear about McCain, Huckabee, Obama and Clinton (after all, if you show one, you have to show the other). But what about candidates, actual or possible, from other parties?
No one knows who they are, and no one ever will, until it's too late. By late August or early September, most voters will already have made up their minds about which major-party candidate to pick. And independent candidates won't ever have the chance to gain as much exposure as their mainstream competitors did, even while those competitors were still battling for their own nominations.
The more debates we have, and the more time the parties take to define their nominees, the more citizens will know who those nominees are, and the better the chances they will pick one of them, and not an independent. Ralph Nader, a staple among independents, took home less than 0.4 percent of all votes in 2004. All independents together took less than one percent of the votes cast in that election. And there were 15 of them, although no one seems to know that.
This way, the party nomination process becomes a circus where the performers are paid, one way or the other, per show. The longer the process takes, the more they will earn, and the less possible it'll be for anyone else to enter the arena.
For those who claim the U.S. is a truly democratic country with a multi-party system, this election may prove that, in times when caucuses and primaries are headline news and presidential debates are primetime material, the same thing that promotes choice may just as well be killing it.
Albright was (somewhat) right
When former Secretary of State Madeleine Albright announced the creation of the Genocide Prevention Task Force, it was fairly easy to disregard. Recent remarks from a Brazilian singer, however, seem to bring it back to mind.
In an interview with a magazine in the country, celebrated singer Nana Caymmi – daughter of Dorival Caymmi, one of the staples of Brazilian popular music – complained about the many problems she has had with her son, who she said was a drug user and still suffering the consequences of a recent bike fall. It wasn't so much that personal admission to the media, but the remark that followed, that shocked the local press.
"I keep asking myself why I need to suffer so much. I'm not Jewish; I didn't crucify Jesus!" Ms. Caymmi vented.
It's an unusual place for such a remark. Since the end of colonial times, when most of the Indian population in the country was exterminated, and the end of slavery in the late 1800s, Brazil has been widely known as the true melting pot the U.S. always wanted to be. It's a place where every flavor flourishes instead of blending away.
And, despite the nation's 73.6 percent Catholic and 53.7 percent white population, the country has never had a true problem with discrimination.
Even during the dictatorship period, there was never the need for a Martin Luther King Jr.-like figure to rise.
While I hope these words will turn out to be meaningless and soon forgotten, Ms. Caymmi's remarks make one wonder whether the same is not being said in other places, by other people who will eventually act on it. History has proven to us the power words have, including hateful words, or words interpreted in wrongful ways.
But, in a way, it also validates the efforts of something like the Genocide Prevention Task Force, whose official goal, as its original press release reads, is to "respond to emerging threats of genocide and mass atrocities."
Since its conception, however, the task force has seemed doomed by its insular approach.
Made up mainly of former government officials who currently hold no official positions, including the co-chairs – Ms. Albright and William Cohen, former Secretary of Defense – the group contains only American figures, and states that its main strength is its ability to influence the American government to act on possible genocides happening around the world.
Instead of using its political influence to attack the source of the problem, the task force has devoted itself to trying to influence a single national government. A government that not only will probably have no direct relationship with any possible threat in Darfur or anywhere else, but is also entangled in too many foreign causes, and spending too much money Congress doesn't approve of, to care about anything else.
While the intentions of the Genocide Prevention Task Force are commendable, its sole dependence on the American government may be its major birth defect. In particular, its key figures are American Democrats from the Clinton administration. And, even though John Danforth – a former Missouri senator and U.S. ambassador to the United Nations – is part of the task force, the group doesn't seem to have the international reach needed to do what it promises.
With genocide being such a worldwide issue, it makes no sense that the group includes only American figures, and focuses on a single national government at that.
How a low dollar is good for Brazil and the U.S.
I remember when I first came to the U.S., back in 1984. For a little child from Brazil, Disney and block-wide toy stores were the dream.
When I finally arrived, the shopping cart was quickly full, to my parents' despair. I didn't care much about prices then, and even less about the exchange rate.
As 2008 starts, a weird sense of déjà vu has hit me.
It turns out that, 24 years later, I actually deserve to have that feeling (not that my parents will cover the bill this time). According to the Brazilian Central Bank, the real – the current Brazilian currency, the 12th of its post-colonial history – has almost doubled in value against the dollar between 2003 and now. While in early 2003 one dollar was worth 3.53 reais, it's now worth 1.77 reais.
Great news for Brazil. Great news for the United States, too.
When Lula came into office for his first term, on January 1, 2003, many predicted the country would soon explode. The then-perennial presidential candidate had always had a Che Guevarian air of revolution, beard and all. Five years later, however, the country is still standing, aside maybe from a hangar here or there in a São Paulo airport.
Building on the stability left by Cardoso, also a two-term president, Lula helped the country fight inflation and keep a positive trade balance that has left Brazil with more than $167 billion in foreign reserves at this point. A tiny portion may have come from an Amazon tree or two, as Al Gore would point out, but most of it has come from a solid government strategy carried out by leftist and rightist parties alike. Eat this, Republicans and Democrats.
So how is this good for the U.S.?
Simply put, it encourages tourism. From the point of view of a Brazilian living in the U.S., it's been a long time since I last saw so many of my countrymen, women and children visiting the U.S.
With the real so highly valued against the dollar – the most valued among Latin American currencies – more Brazilians can afford the plane ticket, the hotel, and the eight dozen receipts that follow. Mickey Mouse and Best Buy are no longer covered in gold, and have become part of their reality once again. Some may even buy huge HDTVs, only to find out later that they don't work back in Brazil (the broadcasting systems are different). The more those travelers come, the more money they spend.
Good for one, good for the other. Good deal.
A lesson from Bhutto: the world (and the U.S.) still doesn't care
If one good thing can be said to have come from the assassination of Benazir Bhutto, it is the sad conclusion that we still don't care, or don't know enough to care.
Truth be told, before the murder of the 54-year-old Pakistani former prime minister and leader of the opposition to the current regime, very few knew who she was, what she meant, and what she could've meant had her political saga continued as foreseen. Aside maybe from the NY Times' World editor, many would think of the former UN secretary-general, and not her, at the mention of the name.
And the saddest part? In a week or two, when Bhutto's buried and her son's named leader of the opposition, we won't hear about Pakistan again until the next major martyr is dead. Elections take place on January 8, when the Bush government will probably decide what they'll do - or not do - with Pakistan. After that date, newspapers will be once more filled with news from Iraq, news about oil (and Chavez, just in case) and shootings at the street corner market. Oh, and the 2008 presidential elections - together with Senate, House of Representatives and some gubernatorial races - are still some 10 months away, so there'll still be some coverage of that.
Maybe a blurb here or there will announce what the other 190 countries have been up to. No matter what one may say, foreign countries exist nowadays just to serve our need for extraordinary news. When the pope died, we were interested; when Madrid and London suffered terrorist attacks, we were interested. As Bhutto's assassination is still fresh in our memories, we are still interested. But give it a week or two.
In that sense, President Bush has to be honored. Due to his campaign in Iraq, the top newspapers in the country have seen a continued prominence of international news stories on their front pages unseen since the Vietnam War, or even longer. Between 2004 and 2005, for example, Iraq was a part of their front pages almost every day, for good and not-so-good reasons. As a matter of fact, a recent study showed that almost three out of every four international news stories on the front pages of major U.S. papers in those two years focused solely on Iraq, and nowhere else. It's been that way, and will probably remain that way for at least another year. If Bhutto's example is any indication, maybe we should hope so.
The sad realization is that, as much as we'd like to say we feel for Pakistan, we don't, simply because we don't know the country well enough. It may be the government's fault, our educational system's, our culture's or our parents'. But, as 2008 arrives through the hands of Dick Clark or Tila Tequila, one can only wonder if we'll see more of the same, or if the U.S. will finally wake up and realize we're not alone.
Hyper-regionalism will destroy American culture
I recently read an interview in a national magazine with the head of one of the biggest newspaper groups in the U.S. In it, he claimed that the solution to saving newspapers is to subscribe to what he called "hyper-regionalism" - focusing almost solely on your own area, and leaving the rest of the world for others to cover.
While this solution may save newspapers, it may also destroy American culture. And that worries me.
His plan does make sense, economically speaking: by focusing on the local, newspapers can more easily attract advertisers. It's easier to charge less while having more targeted readers than to charge ten times more and convince an advertiser their market is "somewhere in there."
It explains why more local and special sections appear in papers now than ever before. What it doesn't show is how, with that, we're killing our own knowledge. International and even national coverage is being relegated to a distant fourth place, with the claim that "if one wants such news, one can find it somewhere else." The problem, however, is that it's always somewhere else.
In a study I recently conducted of top American newspapers in 2004-2005, three in every four international stories on the front pages of those papers focused on Iraq. In other words, in those years we barely showcased news from any of the other 191 countries in the world.
It is true: local newspapers can and should give better coverage of local stories than anyone else. It's just plain logic: beat reporters should know more about sources and locations; what to do, who to see, where to go. Editors should have a head start in figuring out whether or not to cover a specific story. That should not mean, however, that other types of stories are simply forgotten, relegated or left for others.
One can claim the Internet will provide the missing information. However, those same people claim newspapers are here to stay, no matter what; they'll always be the breakfast table reading. And by devoting more and more ink to local news stories, and less to the rest, one could be damaging the culture and, consequently, future generations. Many may make fun of pageant competitors who don't seem to know where a foreign country is located on a map. However, that is the harsh reality of our education and, apparently, of some of our media outlets, which value the local and forget the foreign. Sadly, I fear the day our kids not only won't be able to locate a foreign country, but won't recognize it either. Iran, China, Hong Kong and Dubai will be only words in a dictionary, or in newspapers and web sites read by the handful that actually care enough.
While some already claim Americans know a lot about themselves and very little about others, such actions can only add to that. And, unfortunately, this time it may be true.
Welcome to Editorially Speaking, the blog
Welcome to Editorially Speaking. My name's Danny Paskin, by the way. I hold a Ph.D. in international relations and a Master's and Bachelor's in communication/journalism, and have worked extensively in graphic design. I also used to write an opinion column for the school paper at the University of Miami.
This is a blog devoted to publishing OpEds I have recently written.
The process of submitting these to newspapers is extremely competitive: most newspapers do not accept outside submissions, and those that do often have their own pool of contributors from which they pick. While getting a letter published is fairly easy - I had one in the Washington Post in early 2008 - OpEds are a whole other story.
Thus, I've decided that, instead of letting them gather dust in a folder on my hard drive, I should use new media and publish them somewhere. Comments are more than welcome and, if you're in a similar situation and would like to publish your editorial/oped here, I'd be happy to do it.
Thanks and keep reading,
Dr. Danny.