Your Professional Network Sucks and It’s All Your Fault.
29 Jun
You’re sitting at your desk eating lunch. The thought races across your mind that it’s about time to get a new job. You aren’t happy with what you have now. You need something new and refreshing, with better career potential.
You then start thinking about next steps. You hate the idea of updating your resume and applying for jobs online. You need something more efficient, something more natural. Maybe you should start telling a few close people what you’re thinking and get their opinions. Sounds like a great idea. Now, the question is: who do you tell? Your friends, your colleagues, your favorite barista at Starbucks?
You need someone who will keep it private but also give you great advice on next steps. Someone who will connect you with the right people. You want to start interviewing as soon as possible and see what’s out there.
Isn’t this what LinkedIn was created for!? You immediately go on LinkedIn and start browsing your connections for people who can help you.
Then it dawns on you. That person doesn’t exist in your network. You don’t have that person. Everyone in your network is useless. Well, they aren’t exactly useless – but useless to you.
I hate to be the bearer of bad news here, but it’s all your fault.
Here are five reasons it’s all your fault:
- Everyone in your network is from your current or past jobs
- You didn’t actively try to meet new people
- You’ve helped zero people in your network, so no one owes you any favors
- Remember that email sitting in your inbox that you forgot about? Trust me, the sender didn’t forget
- No one knows who you really are beyond your previous titles
How to fix it?
The easy answer: meet people who could help you in six months, not people who can help you now. As Wayne Gretzky famously said, “skate where the puck’s going, not where it’s been.”
- Coffee meetings, a lot of them. Can’t get away in the morning? Meet over lunch instead.
- Meet different people, not just people in your industry.
- Respond to all those emails sitting in your inbox. Offer your help.
- Reconnect with people you haven’t seen in a while.
- Be consistent. Do it every single day. It’s exhausting, but it’s worth every second.
(That was a spontaneous use of the semicolon, demonstrating the strenuous thought that went into the sentence.)
I have a friend who worked as a copy editor in Canada, and whose education was therefore more British than American. She is very fond of the semicolon, and has pioneered her own use of it, replacing the comma in the greeting of a letter, thus:
Dear MaJa;

She likes to think of a semicolon as a comma with vibrato. (She plays the viola.) I have never liked vibrato. I like a clear sound, without a lot of throb in it. As a kid, I thought it was tragic that the great violinists all had such debilitating tremors.
As if on cue, to curb my wayward thoughts, last week I received in the mail a copy of a little book put out by an English firm called User design and titled “Punctuation..?” It’s thirty-five pages long, with illustrated examples of twenty-one punctuation marks, including guillemets (French quotation marks; not to be confused with guillemots, which are auks), arranged in alphabetical order. It is especially good on the difference between the colon and the semicolon. The semicolon is listed last in the Table of Contents, but it makes an eloquent appearance earlier, in the entry for the colon: “A semicolon links two balanced statements; a colon explains or unpacks the statement or information before it.”
I think the semicolon is more easily understood if it is defined in relation to the colon rather than to the comma. Under “Semicolon,” the book says, “Its main role is to indicate a separation between two parts of a sentence that is stronger than a comma but less strong than dividing the sentence in two with a full stop…. She looked at me; I was lost for words.”
So the semicolon is exactly what it looks like: a subtle hybrid of colon and comma. Actually, in ancient Greek, the same symbol was used to indicate a question.
And it still seems to have a vestigial interrogative quality to it, a cue to the reader that the writer is not finished yet; she is holding her breath. For example, if the sentence above—“She looked at me; I was lost for words”—occurred as dialogue in a piece that I was copy-editing, I would be tempted to poke in a period and make it into two sentences. In general, people—even people in love—do not speak in flights that demand semicolons. But in this instance I have to admit that without the semicolon something would be lost. With a period, the four words sink at the end: SHE LOOKED at me. The semicolon keeps the words above water: because of that semicolon, something about her look is going to be significant.
The title “Punctuation..?” employs a hybrid of an ellipsis and a question mark, with the point of the question mark doing double duty as the third dot in an ellipsis. To me it looks off balance—a triumph of design over tradition—partly because there is no space between the word and the ellipsis. If I had invented the interrogative ellipsis, I think I’d have gone with “Punctuation???” Or maybe “Punctuation;” it would have the effect of suggesting that you supply your own subtitle.
As almost everybody knows at this point, I have resigned my position at the University of New Mexico. Effective this July, I will be working for Google, in their Cambridge (MA) offices.
Countless people, from my friends to my (former) dean, have asked: “Why? Why give up an excellent [some say ‘cushy’] tenured faculty position for the grind of corporate life?”
Honestly, the reasons are myriad and complex, and some of them are purely personal. But I wanted to lay out some of them that speak to larger trends at UNM, in New Mexico, in academia, and in the US in general. I haven’t made this move lightly, and I think it’s an important cautionary note to make: the factors that have made academia less appealing to me recently will also impact other professors. I’m concerned that the US — one of the innovation powerhouses of the world — will hurt its own future considerably if we continue to make educational professions unappealing.
Opportunity to Make a Difference
Ultimately, I got into science in order to make a positive difference in the world. That goal remains, but, for some of the reasons I outline below, it is becoming harder to achieve over time. Google is a strong example of an organization that actually is using advanced computer science to make a real, positive difference in the world. While it’s also difficult to make an impact at an immense company like Google, in the current climate the odds there seem better than in academia.
Workload and Family/Life Balance
Immense amounts have been written about this, and I won’t try to reprise them here. Suffice it to say that the professorial life can be grueling if you try to do the job well, and being post-tenure can actually make it worse. This is a widespread problem in academia, and UNM is no different. But, as of my departure, UNM had still not approved a unified parental or family leave policy for faculty, let alone established consistent policies and support for work/life balance.
Centralization of Authority and Decrease of Autonomy
In my time at UNM, I served under four university presidents, three provosts, and two deans. The consistent pattern across these management changes was centralization of control, centralization of resources, and increased pressure on departments and faculty. This gradually, but quite noticeably, produced implicit and explicit attacks on faculty autonomy, decreased support for faculty, and increased uncertainty. In turn, I (and many others) feel that these attacks subvert both the teaching and research missions of the university.
Funding Climate
A near-decade of two simultaneous foreign wars, topped off by the most brutal recession in two generations, has left federal and state budgets reeling. Compounding this, the current Republican-led poisonous political climate and Republican-orchestrated congressional meltdown have destroyed any chance of coherent, reasoned budget planning. In the face of these pressures, we have seen at least seven years of flat or declining funding for federal science programs and state legislatures slashing educational funding across the country. Together, these forces are crunching universities, which ultimately turns into additional pressure on faculty. Faculty are being pushed ever harder to achieve higher levels of federal research funding precisely at the time when that funding is ever harder to come by. This turns into policies that hurt the university by putting the teaching mission at odds with the research mission and subjugating both to the quest for the elusive dollar. A recent UNM School of Engineering policy, for example, uses teaching load as a punishment to goad professors into chasing funding. (Indeed, the policy measures research productivity only as a function of dollars brought in. Strangely, actual research output doesn’t enter the picture, let alone creativity.)
Hyper-Specialization, Insularity, and Narrowness of Vision
The economic pressures have also turned into intellectual pressures. When humans feel panicked, we tend to become more conservative and risk-averse — we go with the sure thing, rather than the gamble. The problem is that creativity is all about exploratory risk. The goal is to find new things — to go beyond state-of-the-art and to discover or create things that the world has never seen. It’s a contradiction to simultaneously forge into the unknown and to insist on a sure bet.
Traditionally, in the US, universities have provided a safe home for that kind of exploration, and federal, state, and corporate funding have supported it. (Incidentally, this arrangement buys advanced research far more cheaply than doing it in either industry or government would, and it insulates those entities from the risk.) The combination has yielded amazing dividends, paying off at many, many times the level of investment.
In the current climate, however, all of these entities, as well as scientists themselves, are leaning away from exploratory research and insisting on those sure bets. Most resources go to ideas and techniques (and researchers) that have proven profitable in the past, while it’s harder and harder to get ideas outside the mainstream either accepted by peer review, supported by the university, or funded by granting agencies. The result is increasingly narrow vision in a variety of scientific fields and an intolerance of creative exploration. (My colleague Kiri Wagstaff, of NASA’s Jet Propulsion Lab, has written an excellent analysis of one facet of this problem within our own field of Machine Learning.)
Poor Incentives
Further, the “publish or perish” and “procure funding or perish” pressures discourage exploration outside one’s own specialty. It’s hard to do exploratory or interdisciplinary research when it is unlikely to yield either novel publications in your own field or new funding streams. (Let alone, say, help students complete their degrees.) But many things that are socially important to do don’t necessarily require novel research in all the participating fields, so there’s a strong disincentive to work on them. As just one example from my own experience: when you can’t get credit for helping to save babies’ lives, you know that there’s something seriously wrong with the incentive system.
Mass Production of Education
There’s been a lot of excitement in the media about Stanford’s 100,000+ student computer science courses, MIT’s open-sourced classes, and other efforts at mass distance education. In some ways, these efforts really are thrilling — they offer the first truly deep structural change in how we do education in perhaps a thousand years. They offer democratization of education — opening up access to world-class education to people from all over the globe and of diverse economic and social backgrounds. How many Ramanujans might we enable, if only we could get high-quality education to more people?
But I have to sound three notes of caution about this trend.
First, I worry that mass-production here will have the same effect that it has had on manufacturing for over two centuries: administrators and regents, eager to save money, will push for ever larger remote classes and fewer faculty to teach them. Are we approaching a day in which there is only one professor of computer science for the whole US?
Second, I suspect that the “winners win” cycle will distort academia the same way that it has industry and society. When freed of constraints of distance and tuition, why wouldn’t every student choose a Stanford or MIT education over, say, UNM? How long before we see the AT&T, Microsoft, or Google of academia? How long before 1% of the universities and professors garner 99% of the students and resources?
Third, and finally, this trend threatens to kill some of what is most valuable about the academic experience, to both students and teachers. At the most fundamental level, education happens between individuals — a personal connection, however long or short, between mentor and student. Whether it’s personally answering a question raised in class, spending twenty minutes working through a tricky idea in office hours, or spending years of close collaboration in a PhD mentorship relationship, the human connection matters to both sides. It resonates at levels far deeper than the mere conveyance of information — it teaches us how to be social together and sets role models of what it is to perform in a field, to think rigorously, to be professional, and to be intellectually mature. I am terribly afraid that our efforts to democratize the process will kill this human connection and sterilize one of the most joyful facets of this thousand-year-old institution.
Salaries
It has always been the case that academics are paid less than their comparable industry colleagues — often, substantially so. (This is especially true in highly sought fields such as science, technology, engineering, and math [the STEM fields], as well as various health fields, law, and a number of other disciplines.) Traditionally, universities compensate for this with broad intellectual and schedule freedom and the joy of mentoring new generations of students. But all of the trends I have outlined above have cut into those compensations, leaving us underpaid, with little left in exchange. As one of my colleagues remarked when I announced my departure, “We’re being paid partly in cool. If you take away the cool parts of the job, you might as well go make more money elsewhere.”
Anti-Intellectualism, Anti-Education, and Attacks on Science and Academia
There is a terrifying trend in this country right now of attacking academia, specifically, and free thought and intellectualism, generally. Free thought is painted as subversive, dangerous, elitist, and (strangely) conspiratorial. (That word… I do not think it means what you think it means.) Universities are accused of inefficiency and professors of becoming deadwood after tenure or of somehow “subverting the youth”. (Socrates’s accusers made a similar claim before they poisoned one of the great thinkers of the human race.) Politicians attack science to score points with religious fundamentalists and corporate sponsors.
Some elements of these feelings have always floated through the United States psyche, but in recent years they have risen to the level of a festering, suppurating, gangrenous wound in the zeitgeist of the country. Perhaps those who sling accusations at education have forgotten that the US reshaped millennia of social and economic inequity by leading the way in creating public education in the nineteenth century? Or that education has underlain the majority of the things that have made this country great — fields in which we have led the world? Art, music, literature, political philosophy, architecture, engineering, science, mathematics, medicine, and many others? That the largest economy in the world rests on (educated) innovation, and that the most powerful military in human history is enabled by the technological and engineering fruits of the educational system? That the very bones of the United States — the constitution we claim to hold so dear — were crafted by highly educated political idealists of the Enlightenment, who firmly believed that freedom and a more just society are possible only through the actions of an enlightened and educated population of voters?
Frankly, it’s sickening, not to mention dangerous. If the haters, fearers, and political opportunists have their way, they will gut one of the greatest institutions in human history and, in the process, will cut the throat of this country, draining its lifeblood of future creativity. Other countries will be happy to fill the gap, I’m sure, and pick over the carcass of the country that was once the United States of America.
There are other factors behind my decision, of course. Any life change is too complex to express in a short essay. These are the major ones, though. Nor am I necessarily done with academia forever. I’m going to give the industry track a try for a while, but I could well find myself back in academia in the future. There are certainly many things I still find beautiful and joyful about the job. In the interim, I will look for other ways to contribute to society, other ways to help educate the future, and other ways to change the world.
Drop gender for a moment. Are “non-geeks” worthy of scorn, should they choose to enter the holy grounds of a hotel hosting a con or some other holding of geekdom? Now we’re just talking some very simple ingroup/outgroup stuff. Say you want to be a skinhead, so you buy some Fred Perry, the right boots, clippers, your crucie tattoo, and show up at the show, try to pal around with the regulars. Depending on the area, you stand a good chance of being physically beaten. You will certainly not be accepted. They will dislike your rushed, inauthentic attempt to enter an insular subculture. Even if they like you, they might feel that you need an initiation that involves physical violence. Oh, and I forget that I have to keep saying this, but “race” is not a major part of trad skin, so don’t get confused and default to the WP/neo-Nazi stereotype that people are still holding. Separate people. Why did I mention this subculture? Because they are VERY clearly defined. They’re all about being who they are, and all about NOT being who they aren’t. If society happens to cease misunderstanding, misrepresenting, and ostracizing them, they will literally dry up and blow away, leaving behind an empty set of boots and braces and the scent of Guinness; gone to the great football grounds in the sky.
Geek, though. Probably should be a little bit more open and accepting than that. Should it be so open as to be undefined? Probably not THAT open. I’d say the geekometer (installed at all entrances/exits) should really only flash “intruder alert” if someone attempts to enter without a sense of fervent passion or enthusiasm for SOME subject, preferably but not necessarily a subject which is or has been considered uncool by the majority of society. I might go as far as saying that the more popular the subject, the deeper your understanding and interest in it must be to qualify. A “baseball” geek had better know WAY TOO MUCH about baseball; a “Sci-Fi” geek can ride on tradition and get by with a dilettante’s quiver of factoids. Not everyone can be called a geek, or else we need to throw out the word as meaningless (which might be worth doing). Liking something doesn’t count. You have to like something to the point where you sacrifice for it. You can sacrifice money, social acceptance, the blank canvas of your uninked body, or a hefty chunk of free time… or anything, as long as you PAID in some currency to show that your love is real. This does not have to be proven and often cannot be, but it is something that you will know in your heart of hearts and something that will show itself in time. Credentials do not tell the whole story; I am way too shy and lack the organizational/financial ability to plan a simple-to-everyone-else weekend trip. So, I have never been to a “Con”. Asking someone to show their creds is a hostile move.
Is this gatekeeping silly in a sense? Sure. Just like fraternity hazing, and n00b-shaming, and first-day pranks at a job… but it does serve a purpose for the community. It is ritual. Like it or not, if you form an ingroup, an outgroup forms in the negative space around your artwork. The more strongly defined one is, the more the other is visible in contrast. (See Hokusai’s The Great Wave off Kanagawa.) Geek is a pretty lightly sketched, albeit complex, pattern, especially now with the popularity of fantasy, horror, and sci-fi in the media. Some people here want it to be more clearly delineated. So: are geek-defining, ingroup-forming, outsider-witchhunting and/or gatekeeping good? Partially. It’s very simple. It comes down to quantity vs. quality; quality as defined by the presence and strength of passion. Personally, I am not so quick to line up my crosshairs. So some ten-year-old bought a Pac-Man T-shirt without ever having played Pac-Man in the arcade, or in the awful 2600 conversion. It does not hurt me or diminish my love for old video games. Kind of like how gays marrying each other did NOT destroy my marriage (I did that myself, heterosexually). So I think that a certain amount of gatekeeping is inevitable, and if done with a wise heart and kind words, it can improve a culture. Outsiders can be threats, but they can also be viewed as a vast flock of potential converts, each teeming with potential yield. You like resource-gathering submechanics in RTS titles, right? You like recruiting new members to your party? But they should be Quality. You do not want to waste your time with loafers or the dissolute. Balance is all. We risk turning away the golden dragons in the guise of old men should we apply the rules too fervently and too often. CERTAINLY we should not be trying to cull half of the human population based on chromosomal structure. CERTAINLY we should stop alienating people by tug-of-warring them. There is and will be a certain amount of “wow, girls! Let’s be happy that they are even here!”, but that is an archaic reaction. They have been there a long time. They are not novel. They are fellow. They need face no additional challenges based on gender, but they should not be exempt from the minimal, informal, non-invasive policing of the culture that is necessary to ensure a healthy population.
Kelly and Joe… I decree that you have committed crimes against geekdom; crimes of overreaching, crimes of arrogance. I accept that your motivation was based on love for your culture, but it was a love that became twisted and exclusionary; it was a conditional love. Return to your caves, and remain there for five and twenty days, thinking on your transgressions. When you return, you will be tested. If your vision of geekdom has evolved to become one where gender is not a factor and the heart is guarded without jealousy, you will be welcomed back into the fold, and the flames that were snuffed will burn again.